Shawn Ryan Show - #238 Sriram Krishnan - Senior White House Policy Advisor for AI

Episode Date: September 22, 2025

Sriram Krishnan is an entrepreneur, venture capitalist, and former senior product leader at tech giants like Microsoft, Facebook, Twitter (now X), and Snap. Born in Chennai, India, he began his career... at Microsoft before moving to Silicon Valley, where he contributed to product development at leading companies and later transitioned to venture capital as a General Partner at Andreessen Horowitz from 2021 to 2024, focusing on consumer and enterprise investments. In December 2024, President-elect Donald Trump appointed him as Senior Policy Advisor for Artificial Intelligence at the White House Office of Science and Technology Policy, tasked with advancing U.S. dominance in AI amid global competition. Krishnan co-hosted "The Aarthi and Sriram Show" podcast with his wife Aarthi Ramamurthy, interviewing tech leaders and exploring innovation topics. A prolific writer and speaker, he advocates for immigration reform to attract global talent, ethical AI development, and bridging technology with policy to foster economic growth. Shawn Ryan Show Sponsors: https://betterhelp.com/srs This episode is sponsored. Give online therapy a try at betterhelp.com/srs and get on your way to being your best self. https://bruntworkwear.com – USE CODE SRS https://calderalab.com/srs Use code SRS for 20% off your first order. https://meetfabric.com/shawn https://shawnlikesgold.com https://helixsleep.com/srs https://www.hulu.com/welcome https://ketone.com/srs Visit https://ketone.com/srs for 30% OFF your subscription order. https://moinkbox.com/srs https://patriotmobile.com/srs https://rocketmoney.com/srs https://ROKA.com – USE CODE SRS https://ziprecruiter.com/srs Sriram Krishnan Links: X personal - https://x.com/sriramk X official - https://x.com/skrishnan47 Website - https://sriramk.com Learn more about your ad choices. Visit podcastchoices.com/adchoices

Transcript
Discussion (0)
Starting point is 00:00:00 Shri-Ram Christan. Am I saying that right? Perfect. Nailed it, Sean. Perfect. I'm sorry. But, man, we've become friends over the course of the year. And, man, I've just been super excited to get you in here.
Starting point is 00:00:20 I didn't think it was going to happen until, to be honest, until the next election. Because I know you're so busy with all the AI stuff within the White House. So I just want to say, man, it is. an honor to have you here. I've been looking forward to talking to you on the show about all the stuff that you're doing with the AI race with China and just AI in general, lots of questions throughout the country and lots of fear, lots of excitement, a mixed bag of emotions here, but like I said, I'm just, I am really excited. And I love your backstory. You know, we were talking at breakfast about the American dream and you came here from India and, and, and, and, and, and, and, and,
Starting point is 00:01:00 have been a part of many empires and you just you've done really well for yourself and your family and that's what i love to showcase here is the american dream and you are very much a big part of that of what that represents and all things are possible in this country so thank you man uh well thank you and thank you sean i mean it's an honor it's an honor for me to be here and i've watched you for I think over a year and a half, you know, when we first met up, and I've just been blown away by just, you know, the stories, you know, we both talk about at breakfast, so many amazing people, men and women who have just done some amazing things for this country. And I just kind of love the space and the environment you have created. So the honor is mine. I'm nervous. I'm intimidated
Starting point is 00:01:49 to kind of be in this room, which is kind of one of the, probably one of the badass, most patriotic rooms I've ever been in, by the way. Thank you. I'm super excited. Thank you. Well, if it helps, I'm also nervous. I get nervous for every one of these things. You're wearing a jacket. I feel so bad because I got to wear this, I got to wear a suit because it's part of the job.
Starting point is 00:02:11 It's part of the office. Right? And I'm like, man, I'm going to go to Shondra and look like a dark out there in a suit, right? Oh, no. I appreciate you. Like, you know, wearing the jacket. Thank you. Thank you.
Starting point is 00:02:24 But everybody starts off with an introduction. here. So here we go. Sri Ram, Krishna, appointed by President Trump, you are the senior White House policy advisor for artificial intelligence. You're a leader in Silicon Valley with a product, leadership, career spanning Microsoft, Facebook, and Twitter, helping shape everything from Windows Azure to Facebook's audience network and Twitter's home timeline. In 2021, you joined Andresen Horowitz as a general partner opening the firm's first international office in London focused on investing in AI and crypto. You grew up with humble beginnings in Chennai, India, where your love for
Starting point is 00:03:07 technology began when you convinced your father to buy you a computer. With no internet access, you taught yourself to code from books. When a Microsoft executive in India discovered one of your blog posts, you were invited to interview an opportunity that marked the beginning of an exciting professional journey in the United States. You also co-hosted the Arthi and Sri Ram show with your wife. The show grew from a clubhouse talk show into a widely downloaded top tech app and top tech and business podcast. Today at the White House, you are a key architect in America's AI action plan, leading efforts to extend the U.S.'s global dominance in AI. You have also participated in high-level diplomatic efforts,
Starting point is 00:03:54 including AI delegations to Paris and the Middle East, advocating for the global usage of U.S. AI tech stack. Most importantly, you're a family man. You've been together with your wife for 22 years, and you even hosted a podcast together,
Starting point is 00:04:10 and you're a huge pro wrestling fan. So we'll get into all that, but like I said, I want to do your backstory. you're coming to the U.S., how you got into the tech, which we just covered a little bit. And then everything you're doing now with AI. So, like I said, a lot of excitement, a lot of fear,
Starting point is 00:04:33 a lot of mixed emotions about it. So it's going to be an amazing interview. Thank you. And thank you for that. That was super kind of you. Thank you. Oh, my pleasure. My pleasure.
Starting point is 00:04:43 So we got a couple of things to crank out here real quick. All right. One of them is I have a Patreon community. And it's a subscription network. work, but they've been with me actually before I even started the podcast. Then when I did the podcast, started it in my attic and now here we are in a brand new studio. And so one of the things that I offer them to do is I offer them the opportunity to ask each and every guest a question. So this is from Andre. What emerging technology or trend do you believe will have the most
Starting point is 00:05:18 transformative impact on society in the next five to ten? 10 years? And how can policymakers and innovators collaborate to maximize its benefits while addressing potential risks? Well, thanks, Andre. I think, you know, sort of the obvious answer for me is all things AI and artificial intelligence. I think if you look at the last 60 to 80 years, there has been a few huge technology trends. In the 60s and 70s, the first microprocessor was invented.
Starting point is 00:05:51 in Silicon Valley, in the transistor and the microprocessor, which kind of powered Intel, the Moors Law, and things which power every single phone, every single laptop. Then in the 90s, you had the internet, starting with dial-up internet, using the phone line, which is kind of how I started. People of our age group probably started.
Starting point is 00:06:16 And then of course, you don't talk of logging off anymore. The internet is just everywhere. So I think that was transformational for society in some really good ways and some maybe questionable bad ways. Then you had the mobile phone happen in, I'm guessing, say, 2008 when the iPhone came out and people just moving everything onto their phone. But I don't think AI now is the next big platform.
Starting point is 00:06:41 And we are in the early days of it, right? I would say the AI revolution, you know, we can go back to the history and we can get into that, but I think really started with launch of ChatGPT about two and a half years. ago where it sparked captured people's imagination. And now we're just starting to see, wow, what can we do when this thing is advancing so quickly?
Starting point is 00:07:00 So I think AI is the game in town that I think is most important. I think crypto is also really interesting, but I really spend a lot of my time thinking on AI. And I think my job and working with others in the administration is to harness it for the American people. How do we make sure that it makes every individual's lives better? whether it's like, you know, dad, you know, trying to make sure they provide for the family, somebody trying to teach their kids, whatever it is, we want to make sure it makes their lives
Starting point is 00:07:30 better. At the same time, you know, we're also in this very intense race with China on all things AI, which we can get into in detail. So I think the way I see my job is, one, we need to win the AI race with China. I think the consequences are catastrophic. if we don't. And second, equally important, is making sure, like, AI benefits every single American. Like, every single person watching this, Andre, who asks the question, my hope is,
Starting point is 00:08:03 when my time the administration is done, they are like, OK, AI is helping me in my life just a little. But that's the goal anyway. Yeah, you know, I feel like with AI, this is, I mean, we were talking about it at breakfast, but you mentioned this like when the automobile wasn't done enough. I feel like this is, this is,
Starting point is 00:08:20 a moment in humanity like when the wheel was invented. Everything's going to change. And you know, what I love about our conversation at breakfast and what I love about you is, you know, you understand the importance to win the AI race against China, but you also understand the balance that needs to happen within, you know, within not just the U.S., but within the world. And there is a lot of fear of AI is going to take all of our jobs. AI is going to, you know, crush everything.
Starting point is 00:08:49 And you are the guy that has to navigate through all of that and make sure it benefits American citizens and humanity as a whole while simultaneously winning the race against China, which is no easy to ask because China plays by damn near zero rules. And so it's going to be a fascinating discussion. Yes, it is. It's super fun. But got a couple gifts for you. Oh, wow. Okay. Yeah.
Starting point is 00:09:17 one This is the part I was looking forward to the most I have no shame I want the gifts There we go Those are
Starting point is 00:09:25 Vigilance League Gummy Bears Made here in the USA Legal in all 50 states There's no THC Or anything like that In there Can I eat some
Starting point is 00:09:33 Yeah Yeah Let's do it Dying to know What you think here Okay One out of five Five being the best
Starting point is 00:09:45 I'm gonna say five Nice Wow. Good answer. Good answer. And I got you another gift. Oh, man. So I think you're really going to like this one.
Starting point is 00:09:57 Okay, wow. Here you go. So I got a friend over it. Sig Sauer. Do you know what Sig Sauer's? His name's Jason. I told him you were coming on. He's trying to figure out AI.
Starting point is 00:10:10 So he's really looking forward to this episode. But he wanted me to give you this. Wow. But hold on. How do I get this open? here we go oh my goodness okay what am i tell me explain this to me all right so that is the you want to hold it up yeah it's unloaded i'm going to teach you how to use that yeah maybe you can teach me some ai stuff and i'll teach you out of show yeah but um so that is the sigs sour
Starting point is 00:10:35 21 11 gt o 9 millimeter it's got 21 rounds in the magazine plus one in the pipe so 22 round capacity. It's got that, see those slits in the front? Yeah. So that is to help with with recoil management. So it's going to keep your gun down when you shoot. It'll minimize recoil a little bit. Keep your gun a little bit flatter. They're redoing their whole optics line. That's the latest red dot. So using a red dot makes shooting a lot easier to hit the target. And so in the gun industry, there's this rage about these 2011 pistols. Everybody wants a 2011. Pistol. And Sig was, I don't want to say late to the game, but everybody's been really excited
Starting point is 00:11:20 about what they're going to release in the 2011 series. And so that is their first model that just came out. I'm going to ask you a favor. After we're done, you want to show me the ropes and how to use this properly. Hey, I would love to do it. Let's do this. You know, one of the things as a federal, this is gorgeous, so thank him for me. This is gorgeous.
Starting point is 00:11:41 And as a federal employee, you know, you've got to declare. You've got to declare, like, every single gift you get. And this is going to be, I think, probably the most interesting declaration for sure, you know, when I file in the paper. But, no, this is gorgeous. So thank you. And you're going to teach me how to use this. I will. We'll do it on a break.
Starting point is 00:11:57 There we go. All right. Now, before we... Now, I have a gift for you. I love gifts. And thank you. Just keep this right here. Now, outside of...
Starting point is 00:12:13 I would say technology. I would say one of the most important things in my life is pro-professional wrestling. I got hooked to it as a young kid, started watching it. I think it was my introduction to America. It was my introduction to sort of sports and entertainment. It's been a huge part of my life. So I got you. This guy.
Starting point is 00:12:41 No way. here. Hold on. Okay. So this is a WWE title belt. It's not just any WWE title belt. You know, fans who are watching this and know this is known as the Winged Eagle WW Championship belt because, or back with WW.
Starting point is 00:13:05 So in the 90s, right? Like they had, like, WWE has had, WWV back then, has had many, many championship title. bells, but connoisseurs or fans might say, and there may be some contours here on this, the best one of all time was this guy right here, right? It has been held by greats like Brethaheadman Hart, who's my favorite guy growing up, The Undertaker, Sean Michaels, a lot of greats. They actually had a comeback where Cody Rhodes worked recently. It's a core, core part of my childhood. I had it on T-shirts, everything else. And this guy, I'm holding in my hand, when my wife and I had this podcast for the last four years
Starting point is 00:13:47 until this job, if you watch episodes, you'll see a title behind me on the shelf. And so I grab that. That's it? Oh, yeah. Oh, man. So, you know, this is like some classic pro wrestling history, like right here. There you go. Oh, man, I love this.
Starting point is 00:14:04 There you go. Wow. Thank you. That is amazing. So that'll be displayed to the studio. from here till it's over. Yeah. Well, Sean's title range starts right now. Did you watch
Starting point is 00:14:20 Burris Ling much as a kid now? I watched, actually I watched it in the 90s. Okay. Who was your favorite? Who's my favorite? I don't know. It was probably I went back and forth, but I really liked Hulk Hogan and the Ultimate Warrior. R.P. Hulk Hogan, Terry Bulli, Ray Bauer, Sean Michaels, Jake
Starting point is 00:14:40 the Snake, all of them. What is your favorite match that you remember? You know what I always liked was the Survivor Series Cage Matches? Yes, yes. Which one? You remember, like, for example, like, well, there was a SummerSlam one reading Brett and his brother Owen, but which one? I don't remember exactly which one, but those were always my favorite when they would all show up.
Starting point is 00:15:05 Yeah. I think when I was a kid, I loved those, and I don't know what I was watching. It was like, you know, these large-than-life characters. You had Hulkogen, eat your vitamins, say your prayers, strain every day, you know, the red and yellow coming out. My guy was Brett Hart. So, Brett Hart, do you know Brett? Oh, yeah. Brett the hit Brent Hart.
Starting point is 00:15:28 I would say if you ask people for the Mount Rushmore professional wrestling, Brett will always be in there. He was one of the most technically savvy wrestlers of all time. fantastic at telling stories with his body and in the ring. Even if you watch his matches now, it is just so incredibly crisp and smooth. And I know we're talking breakfast. You mentioned Stone Cold's, you know, Steve Austin. Brett was involved in maybe one of the most important matches of Stone Cold's career, right?
Starting point is 00:16:02 Like the I quit Dresselmania match. Have you seen this? No. Oh, okay. So this is great, right? Like, so I forget which year. I'm going to say 96, but I may be off. But so do you know what a heel and a face is in pro wrestling?
Starting point is 00:16:21 Mm-mm. Okay. So in professional wrestling, right, by the way, folks, spoiler, it's not real. It's scripted. And it's K-Fab, as they would say. K-Fab is sort of the reality inside professional wrestling. And inside K-Fab, a face or a baby face. face is the good guy. And the heel is the bad guy, right? Like, so Hulk Hogan, right, for years
Starting point is 00:16:47 was always the face. The red and yellow, right? He slams, slams Andre, right? Like, you know, he was the face. And then the heel would be the bad guy and the heels would do evil conniving things, you know, they would, you know, do a low blow or throw something in your eye or, you know, it's one of those things where they kind of cheat to it. Those are bad guys. And in the 90s, what was happening was the W.W.A. was kind of going, maybe along with pop culture, was going through the shift where often sometimes the bad guys would start to get cheered a little bit more. There were these anti-authority figures and, you know, and the good guys, what people would call the classic white meat baby face, right? Like, we're not getting enough cheers. And so in 96, you know, Brett, you know, was, I think, was the, I'm not sure whether he was the champion at this match. But Brett was a good guy. He was a face. He was one of the lead faces at the time.
Starting point is 00:17:43 And Stone Cold Steve Austin was kind of coming up then was a heel. He was a bad guy. And they kind of set out this match. It's called the I quit match. And the idea behind this was that, you know, the only way for you to win is the other guy has to say I quit. No tap outs. No counter.
Starting point is 00:18:04 Roger would say I quit. And I think the match was in Chicago. It's WrestleMania. But the WrestleMania is sort of WWs. you know, big extravaganza. It's like the Super Bowl. It's a big event. There's been, I think, 41 so far. And so they have this guest referee, Ken Shamrock, who's from, I think, a mixed martial arts background back then. So, and Brett and Steve had this amazing idea, right? So what happens is they fight all over the ring, all over the crowd. It's bloody, you know, Steve Austin,
Starting point is 00:18:35 you know, he starts bleeding, uh, cut open the hard way. It got juice, as the wrestlers would say. And at the end, you know, Brett had his finishing move, the shop shooter, right? And, you know, and the shop shooter, once you're in the shop shooter, when I was a kid, right, like, even though you had this thing, we were like, do not try this at home, right? Like, when my cousins were over, like, I'm putting them in the shop shooter. Kids, don't try this at home. And but at the end of the match, you know, you know, Steve's in this bloody mess,
Starting point is 00:19:05 Brett puts him in the shop shooter, right? And there's this iconic image of Steve just yelling out in pain, blood in his face, but he's not saying I quit. He's not saying I submit. And Ken Shamrock, you know, it's like, do you quit Steve? And he's like, no, right? And after like minutes of Steve in agony, he refuses to quit and he passes out. And, you know, and they end the match there. So that match was crucial for both their careers because it's one of the rare matches where they did.
Starting point is 00:19:37 what was known as a double turn. So a turn in pro wrestling is when a good guy becomes a bad guy, like a face becomes a heel, or vice versa. And in this match, because Brett would not let go off the hold, and he would then push Sam Rock, he kind of became a bad guy. At the same time, because Steve did not give up. He was bleeding, he was in pain, he never said I quit. He became a good guy.
Starting point is 00:20:04 And it launched Steve's career. So that is sort of one of the, and people ask you for their like top five matches in history, like that's like one of the top ones. So anyway, so Brett, so that's a great part of the W.W. History right there. Man, thank you. This is, this is a big, totally unexpected.
Starting point is 00:20:20 This is awesome. I was excited. I was excited for this. You know, wrestling has been like a big, big part of my life. I've learned so much from it. You know, I've learned storytelling. I've learned, you know, what connects to audiences. Because a lot of people think about it, and they go, oh, it's just like grown men in spandex,
Starting point is 00:20:40 like throwing each other around, right? Or they say what you did, which is like, I used to watch it as a kid. And I used to watch it, it's great. You watch it, it's super fun. I try and getting my kids to watch it. It's great. It's spectacle, good guys, bad guys.
Starting point is 00:20:52 They're larger than life. They're huge. They're characters. But as I got older, right, I saw the sophistication and the art form, right? Because number one, it requires serious athletic ability. These guys and women, you know, are incredibly athletic. They're taking real risks, right?
Starting point is 00:21:14 Like when they jump, you know, off the corner and onto a table, well, that's a real table, that's a real corner, and people have hurt themselves and, you know, some really bad thing happens. So they're taking real risk, right? And then they're trying to tell a story. And instead of a story, which is through CGI and dialogue, right, it is a story they are telling with their bodies, with some, you know, promos and dialogue, but also with the audience together. Okay. So wrestling only works because the audience is in there with you.
Starting point is 00:21:47 And often the greatest wrestlers know to change what they are doing to get a different reaction or sometimes change something what they're doing on the fly because of the audience. And there's a famous example of this. You know who the Rock is, obviously. Oh, yeah. Okay, so, by the right, we're talking about, yeah, we want to get a wrestling first, okay? And so, but one of the greatest WrestleMania matches for all time was in WrestleMania 17 in Toronto. And the Rock is a good guy. And Hulk Hogan, and the Rock's a young guy, I think it was maybe late 20s or 30s.
Starting point is 00:22:20 He's kind of the peak of his young career. He's a great guy. And Hulk Hogan had come back as a bad guy. And I think Hulk Hogan was maybe in his late 40s. He's kind of in the, slightly the tail end of his career, right? Like, he was not the red and yellow. And this is WrestleMania, right? And, you know, you should watch it on YouTube.
Starting point is 00:22:38 They come to the middle of the ring. They look at each other. And, you know, it's like light bulbs going off everywhere, like 100,000 people flashing. And, you know, and this turnaround, look at the audience. And people like, oh, my God. You get Hulk Hogan one of the greats, the greatest of all time, slammed Andre, the giant. And then you have the people's champ. you know, the rock, like.
Starting point is 00:23:00 And so what happens, remember the rock went in as the good guy, Hogan, it's the bad guy. Within the first 30 seconds, right, it becomes very clear that the crowd doesn't care who the good guy or the bad guy is. They wanna cheer Hulk, they wanna cheer Hogan. They wanna cheer their childhood hero, right? And one of the amazing things which happens
Starting point is 00:23:20 is in the first 30 seconds to a minute, the rock does good guy news. If good guy is supposed to do a, work in a certain way, right? You do legitimate moves, you don't cheat, etc. The bad guy works sitting in a particular way. But these guys, without even talking to each other, they realize, oh, wow, like, this is something different. And they call an audible. And they, without even talking to each other, they say, we're going to flip the story, right? And for the rest of the match, the rock, right, acts physically as the bad guy. And if you watch it, you can see that
Starting point is 00:23:52 happen, right? And there is this amazing moment which happens where, like, about near the end of the match, the rock hits Hogan with his finisher, you know, the rock bottom, and Hogan kicks out, and he huls up. He's like, and he's sucking in the Hulke-Maniac, sucking in the energy. And people, like, connected to their childhood. I remember watching it. I was connected to my childhood, right? People like, oh, my God, Halkman is right?
Starting point is 00:24:16 No, the rock eventually, you know, winds up winning. But it is one of those amazing moments. I was watching it decently because, obviously, tragically, Hogan passed away recently. So I went back and watched a lot of these old matches. And it's one of these amazing things where you're like 100,000 people, right, feeling something together to iconic superstars, right, like knowing how to navigate them. And in the fly, without even talking to each other, redoing the storytelling and creating, you know, something which any wrestling fan would tell you, if you're watching that and if you don't have goosebumps, like, you know, something's wrong with you. And so anyway, so I'm a big fan of wrestling. I still watch it now.
Starting point is 00:24:58 I'm fortunate it would become friendly with some of the people who do it for a living. I have a lot of respect. I learn a lot from it. Interesting. Interesting. You know, we have a saying here, it has or as the whole team. We compare U.S. politics to professional wrestling. That is true.
Starting point is 00:25:16 Do you see any similarities? Absolutely. First of all, the person I work for, the President of the United States is in the W.W.E. Hall of Fame. Okay, so let's just start there. And I think that, you know, a lot of people in politics and other parts of the world, other domains, learn from pro wrestling. For example, right? You know what cutting a promo is? No. Okay, so cutting a promo is when, let us say, you and I, I'm a good guy, you're a bad guy, and we have this big match coming up this weekend, right? And we want to get everyone to buy it. Back in the day, you would pay 60 bucks.
Starting point is 00:25:54 Get it on pay-per-view. These days, you probably subscribe on Netflix or something. A promo is us trying to hype up the match. And you'd be like, I'd be like, another time, I'm a good guy. Sean, I respect you for everything you've done. But this Sunday, you're going down, right? Or something like that was a bad promo, right? Like, don't judge me.
Starting point is 00:26:13 But you kind of build up to it. And you basically put over, the other person's kind of put over, meaning you make sure the other person looks a legitimate threat. because nobody wants to see you fight someone who's not a legitimate threat. So you've got to put over the other person, right? And then at the same time, you're trying to build interest, you know, for this match, right?
Starting point is 00:26:33 And I think, you know, if you look at a lot of people in politics, they'll learn from that. And even outside politics, for example, Floyd Mayweather, money Mayweather, he took a lot of how we constructed the TMT, the money character from wrestling, right? Connor McGregor. Exactly, right? The walk, the whole thing, right, like a lot from pro wrestling, kind of, because people want to be invested.
Starting point is 00:26:58 They want to see the story, you know, they want to be invested in you, right? They want to see you kick someone's ass or see your ass get kicked because, you know, you're the jerk, whatever. But ultimately, you're trying to get people to invest in a story and I guess, you know, watch the fight. And I think some of the best people to do it, find a way to do it where you're like, man, I'm going to watch history. And I've got to watch it. So, and maybe I'll stop at this. People ask me, you know, if I'm going to new to pro wrestling, right? Okay, you're going to teach me how to use a 6-R.
Starting point is 00:27:30 I'm going to get you back into pro wrestling. Right. And the match of, I'm going to have two matches. I'm going to have you watch. And these are from 2009 and 2010. And they're Sean Michaels and The Undertaker. Okay, two matches in a row. So the quick story there is that Sean Michaels and The Undertaker,
Starting point is 00:27:48 I'm sure you know who they have seen them growing up. Oh, yeah. Right? Now, The Undertaker had what was called the streak. And the streak was the sequence of matches at every WrestleMania that he had won. He was the only person, and at the time, I think he had won maybe 11 or 12 years in a row. Imagine one football team winning every Super Bowl for 11 years in a row. He eventually wound up getting the streak broken, I think, at 18 years.
Starting point is 00:28:17 But at the time, the streak was this magical thing, you know? It's like, next person up, who's going to take him down? You know, he doesn't get taken down. Sean Michaels, right? You know, Mr. WrestleMania, right? Like, he was somebody who's iconic moments, went up against Undertaker. So, WrestleMania 25, right? I think this was 2009.
Starting point is 00:28:34 They have this epic contest, right? Just epic contest. One of the greatest matches of all time, right? You watch it, right? You know, just a storytelling they do. 100,000 people, again, invested. You know, I get goosebumps from just talking about it, right? Now, amazing match.
Starting point is 00:28:52 Now, what happens after that, Sean Michaels, basically the character goes crazy. He's like, I came so close, and I couldn't get it done. And so he challenges the Undertaker again for a rematch at the next year's WrestleMania, a year later. And Taker says, no. And Sean just does a lot of these things to get him to say yes. And so he goes on. And Takers say, okay, fine, I'm going to give you a rematch, but it's not going to be any match. you know, if you win, you're going to break the streak.
Starting point is 00:29:20 Nobody has ever done this before. But if I win, you're going to end your career. Okay? So, now, you can think of this arresting storyline, right? It's like two people in a room wrote it up. On the other hand, it is real. Because we grew up with watching Taker win every match for, through our childhood. It is a part of my story growing up, right?
Starting point is 00:29:43 It is part of history. At the same time, we grew up with Sean Michaels, right? Like, I loved Sean Michaels, I hated Sean Michaels, I loved Sean Michaels again. So you knew when you watch that WrestleMania, something beautiful, which was real, was going to end that evening, right? When somebody counted one, two, three. And anyway, so you should watch him at the end of the match, you know, it is huge and emotional. And Sean is beaten down, and Taker tells him, you've got to stay down, man. Like, you'd be, I'm just, you know, it's too much.
Starting point is 00:30:17 You got to share it on. And Sean gives him this throat slash gesture, which basically says, like, you got to put me down. Otherwise, I'm going to keep coming back. And then Taker does this big move wins. And then there is a sense of sadness there because, you know, you're like, wow, this amazing match ended. But Sean's career did really end. He retired after that match. And it was one of those sort of this.
Starting point is 00:30:44 epic, mythological feelings, like these two amazing gladiators who have never been beat in Phryan So anyway, so when I tell people to watch wrestling, like, you know, you should watch that. You should watch Rock vs. Hogan and it'll make you feel something. I will. I will. Incredible. Let's move into your story. Yeah. You ready?
Starting point is 00:31:05 Mm-hmm. All right. Where did you grow up? I grew up in India in a city called Chennai. It was called Madras when I grew up. India has four major cities, Delhi, Mumbai, Kolkata, and Chennai, and we are in the south. And, you know, what I would call, what Indians would call a lower middle class, middle class upbringing. My mom stayed at home, you know, took care of the kids.
Starting point is 00:31:36 She was very focused on family, very religious in a lot of ways, very focused on just raising us in sort of the proper way with a lot of right values. My dad kind of had the same job for 40 years. He worked in this nationalized company. And we're one of those people where, you know, I think my dad, when you're a kid growing up, you don't really think about how your parents are acting towards you. It's just like the way they are. But looking back now and especially, you know, we were just kind of talking about
Starting point is 00:32:13 sort of our parenting journey in a way. I see how much they gave me in so many ways. So my dad, we never had like real money of any kind, but he always made sure that, you know, we were comfortable and that, you know, we never felt like, you know, we were left out. But I now know just knowing some of the numbers, like that must have not been super easy.
Starting point is 00:32:41 I also really respect now, because I took for granted then. He passed away. He died in 2006, but he was just there all the time. Took me to school every single day, came back from work, a long day of work, hung out with us every single day. And back then, you know, just the way things,
Starting point is 00:33:02 you're in your kid, you're 5, 7, 8, you're like, that's all everyone else. But now I realized how fortunate I was to kind of have that stable, grounding experience. And my mom was just sort of this huge source of strength. You know, she was taking care of us without a lot of resources. And, you know, she was incredibly focused on my education. Because when she grew up, you know, they didn't have any access to books or, you know,
Starting point is 00:33:32 they had really struggled economically. And she was like, I'm just going to make sure you had a good education. And she would save up money. And, you know, I used to get really into reading. a lot and should make sure I could always buy things. And again, at the time, I didn't really appreciate all the sacrifices they were going through to make that happen for me. But I'm just very, very grateful for the experience I had because I think, like, that
Starting point is 00:33:57 grounded me, taught me, you know, what being a great parent who's dependable, who's always there, you know, looks like. And I've been so much more fortunate in so many ways. but yeah, that was a, I would say I had a pretty great childhood. Yeah, you know, I watch your socials and, I mean, it's just, it's really, it's nice to see somebody in the position that you're in that is so focused on family. I see all the stuff that you're posting with your kids and your wife and, and you take your family time very seriously, but, and I commend you for that.
Starting point is 00:34:33 I think that, I think you're a very positive example of what it means to be a husband and a father and a family man and um you know what one thing i would like to talk about with you is you know this i mean you're raising your kids here in the united states you've done very well for yourself and that is a i probably shouldn't assume anything but i think that looks very different to than how you grew up in india and so could could you go into you know some of the differences or what what your childhood was like growing up in india uh Well, very obviously, I've just been very fortunate here, and I think, you know, my kids are very young, so I think there's very, very different childhood from what I had. But let's see.
Starting point is 00:35:19 So, you know, we, you know, India has this term, which I don't think is asked from the United States, called middle class. And, you know, in my family, education was a huge priority. you know, they kind of, I think my family kind of pinned all their hopes and dreams on me in a lot of ways. They were really focused on education, and it was very high pressure. If you didn't do well academically, you know, you were really going to struggle, or at least that was what I thought. No, I don't think this is actually true, and I've seen a lot of folks kind of, you know, have great successful careers and whatnot. But that was what I was told, like, this is the way out. And when I was 18 or 19, you know, I convinced my name.
Starting point is 00:36:03 dad to buy me a computer, one of these old, again, I'm dating myself here, a Pentium 3 box. Back when a CPU was one of these big beige boxes that showed up and he stuck it on a table. And it was a big deal for him. I recently reconnected with somebody who knew him at the time, and it cost him like a year's paycheck or something. Wow. This is a big deal. right? And again, my kids, I'm like, you guys have it so easy, right? Like, you know, but, you know, it's a big deal for him and had begged and pleaded for a long time. Now, again, he was a lot of great things. He was a rebel in a lot of ways. I get some attitudes from him, but he didn't really understand technology. It was like he was not of the generation, right? He just didn't understand computers. But he kind of, but he took a bet, you know, he spent this, you know, serious amount of money, you know, on me. And then I, even though he didn't know what I was going to do with it, And then I convinced him to get dial-up internet, which was kind of a big deal.
Starting point is 00:37:06 And I would say, like, my whole story exists because of that and America. Because, like, you know, well, and I guess pro wrestling, because that's kind of how I grew up, like, really, you know, understanding America in so many ways. But, you know, I grew up, you know, I would spend every single night learning computer, and learning to write code. And way back then, it was a bit hard to kind of get, you know, you can get online and you run up the phone bill. There's a concept which kids these days don't know, which is you log off the internet. Do you remember that when you used to log off?
Starting point is 00:37:40 I do remember. Yeah. Do you use to like run up the phone bill back home? Oh, yeah. Yeah. And people like, Sean, get off the PC. You know, we have to make a phone call, right? Like, oh, you're like, you're doing, you have a big download and then somebody picks out the phone.
Starting point is 00:37:53 Anyway, so, kid these days. But I would have to do it late at night because that is when the internet would be faster because during the day other people would be using these phone lines that would be slower and I would get all these kind of coding guides
Starting point is 00:38:11 I would get all these used books on writing code and at the time I was a bit lost I would say in what I wanted to do with my life and how I fit in I wasn't very social and I was kind of lost. But then I realized that this code was something that brought me joy. And one of the things I think people who don't do computer science don't understand is the deep joy of creating something on your computer with computer science code, right?
Starting point is 00:38:41 Because the computer is unforgiving. You have to figure out a way to get it to understand you. You have to solve mental, intellectual problems. you never get it right in the beginning. But AI is now helping with that, which we can get to later. And it is an intellectual exercise. And once you get it working, it's just amazing. You just feel so good because you've created something
Starting point is 00:39:06 and no one can take that away from you. So I was just doing this every single night. I would stay up from like 10 p.m. till 4 a.m., you know, and then go late to school the next day. I would just be writing code online. And this is right, I think, too amazing. amazing things, you know, wound up happening. One is my now wife, Artie, you know, she was very similar to me because, you know, she came from, you know, she was kind of one of the first
Starting point is 00:39:33 people in her family to go to school in another city. She was one of the first people to be kind of good with computers. And she was in a different city and she was learning how to write code herself. Now, at the time, I'd kind of built a little bit of a reputation myself in my town as the computer science guy. I'd written some piece of code. Do you know what a virtual machine is? A virtual machine? Yeah.
Starting point is 00:39:57 No. Okay, so a little nerdery here. If you're going to nerd out for a bit. So people usually write, manipulate computers using programming languages. You might have heard of that, right? And, but, you know, but one of the things people figured out is if you directly have a programming language access the computer, it might be unsafe. or it might be sort of hard to manipulate in lots of ways.
Starting point is 00:40:25 So they came as an idea of a virtual machine, which is a computer that runs inside a computer. Okay? And the reason you do it is you're not giving it access to all of it. It's a sandbox, okay, which runs in a computer, but you can run any sort of code on. You can write a game, you can write all these amazing things on it. Now, it's a very sophisticated piece of technology
Starting point is 00:40:44 because you've got to be really fast, because you've got to run it like a computer, you know, you've got to know all that. You've got to know all the things the computer does, and you've got to make sure it doesn't sort of break out of the sandbox and then does maybe evil things on the actual computer underneath. And for a lot of reasons, I got really
Starting point is 00:41:01 into building virtual machines and how they work and how to make them fast, how to make them performant and all these kinds of things. And I kind of had a little bit of reputation. Think of the guy in your high school who can maybe dunk or athletic and the schools around him. I was kind of like that. guy. And so the wife and I, you know, we started chatting online because somebody had introduced
Starting point is 00:41:24 us, say, hey, you guys are these two nerdy computer science kids who seem to know how to do stuff with computers and can you help me. And at the time, I didn't even know she was a girl. I didn't know what her age was, but she was like, I was just this friend who has similar interest to me. So we would chat every single night. We would stay up night chat chatting. And then after six months, you know, she's asking me, hey, you know, like, who are you? I'm chatting with you, who are you? And I'm like, oh, I live in the city, which is kind of nearby. And then we wound up meeting a few months after that. First time we met, and we've been together ever since.
Starting point is 00:41:58 This was for the last 23 years now. 23 years. Yes. 23 years. When did you get married? We got married in 2010. But we met each other in 2002, started dating in 2005, 2006. Then in 2010, our parents were like, you,
Starting point is 00:42:17 folks are both crazy, you'll never find anyone better than each other. You know, you've got to get mad at and, you know, got married in 2010 and I have two kids, the whole thing. So, yeah, so I always have to say the best thing computers have brought me is Artie, because without that, you know, I wouldn't have her. I wouldn't, you know, and none of this would be possible. The second thing which wound up happening is that I was writing all this code. And at the time, you know, there was a Microsoft executive who was touring India and they wanted some student who could do some of the things I was doing. And somebody had written, seen something
Starting point is 00:42:51 I'd written online, and I get this cold email saying, hey, do you want to come out and do this little event for us? Because they wanted a student to come do a demo with this kind of this big shot, exact. And at the time, I couldn't really speak English. I was kind of living in my bedroom, doing these things on my computer. And this is like, this is amazing.
Starting point is 00:43:14 So I remember I got on my very first plane ride ever. I've never been on a plane before. And my first fancy hotel room, and we were like, I'm guessing, I'm maybe 20 at the time. It's like crazy. In this fancy hotel room, Mike's are paying for it. And it went really well. And this executive, you know, who's not retired, but it was amazing, was like, you kids, he met me and my now wife, Arthur. He was like, you kids should work at Microsoft. I was like, sir, we would love to, you know, but we are here. We don't even know how to find you. So, you know, long story short, you know, he said, you know, he said, let's find a way to get you in,
Starting point is 00:43:50 because he was kind of impressed by all the things I was doing. And it was very connected some of things Microsoft was also working on at the time, like cloud computing. So he was like, he does this kid who's doing these things that we are also working on. Let's just find a way to connect the dots. And so a year later, they flew me to Seattle, Redmond,
Starting point is 00:44:09 where Microsoft is off. And I remember flying in, you know, and Seattle's a beautiful place, you know, the Pacific Northwest is beautiful. And I just fell in love. I was like, man, this is all I want to do. I've grown up, you know, watching Hulk Hogan, watching The Rock, watching Star Trek, watching these movies, and I'm now here.
Starting point is 00:44:29 And even though I couldn't really speak English very well, even though I didn't really fit in very well, I was like computers I can do and let me figure this out. So that sort of, you know, started my whole journey. So I think without sort of the, my parents, sort of the environment I had, without like my dad spending, taking this bed on me. You know, he was not sure what a teenage son's on the internet late at night, right? He was like, I don't know what's happening there in that closed door. Like, I don't know, right?
Starting point is 00:44:55 Like, you know, the door is closed. But they took a bed on me. And one of the things you realize you get older is how fortunate you are, right? So without my parents, you know, being so focused on me and making sure, I had a better life than they had giving these opportunities without these people who just saw me and they were like, hey, I see something in this kid. Let's take a bet on him.
Starting point is 00:45:28 And I would say, mostly I would say this country, right? Every single thing I have professionally, my wife has been possible because of America. We live in Seattle, in San Francisco, and I live in Washington, D.C. So I'm so grateful for all these things, and I've just been very fortunate, right? I've been in the right time, right place,
Starting point is 00:45:52 and I've been very, very fortunate. Yes, yes, you have. And wildly, both you and your wife, are they wildly successful. Are your parents still alive? No, my father died in 2006, and my mom died three years ago. And I had sort of, I think,
Starting point is 00:46:13 one of the things you don't realize, so 2006, I was, what, 20 to 23 when my dad passed away. It's kind of a big regret in a way because I never got to show him some of the success I wound up having later. Like, I think, you know, a lot of ways you want to show your parents that, oh my gosh, mom, dad, like, look what I've made of myself. Look what I've become. And at the time, I had just gotten this job and he knew it was a big deal because it doesn't I was doing these cool things and I was in the newspapers and whatnot, but I never kind of got to show him that, man, like, all the investments that you made in me, right? Like, you know, look, I've done something with that. I also, I kind of missed out on having an adult relationship with him, which I think is quite important. I do think, you know, I grew up in my 20s with my dad. So, and I always look, you know, a lot of people don't have parents. all or, you know, they didn't have the benefit of the childhood I had, but I sort of feel like I missed out because I never, I was a kid growing up, I had, you knew your dad in that
Starting point is 00:47:23 facility, but I, I'm saddened me that he never saw me, you know, you know, have my career, and I saw me get madded, have my kids, so I never had sort of an adult relationship with him. I never, I don't got a sense to provide for him, you know, I would have loved, so he, he was a rebel, by the way, so he, back in the community, I grew up. in, right? It's very community-oriented, et cetera. But you're kind of supposed to stick to your lane, right? But he was a bit of rebel. He was like, I want to be a writer. I want to write my own books. He wrote, he was kind of, he had all these creative ideas. I think if he was just, if he didn't, alive five years later, he would have self-published on the internet and, you know,
Starting point is 00:48:03 probably, you know, yeah, you know, being on YouTube comments with conspiracy theories, he was that person. But he missed out, right? And, but I think I have some of that nature. in me to be like to speak out. So one of the things he really taught me is that, you know, a lot of people would be like, you know, if you see something wrong, you mind your own business. But if you saw something wrong in our community,
Starting point is 00:48:26 he'd be the first person to go speak up and try and help one. And as a kid, you're like, man, like, I don't want to get into this trouble. Like, what is this? Like, I don't want to keep my head down, but he was always good at that. So I miss him.
Starting point is 00:48:34 I feel sad that and get the chance to kind of show him everything and maybe provide for him. My mom, you know, she passed me three years ago, but, you know, I'm very fortunate and she got to kind of see my family, you know, she built a very close relationship with my wife, you know, my first kid, and yeah.
Starting point is 00:48:50 What would you, I mean, what would you say to your dad if he was here today? Man. Think about this. that I somehow made something of myself. I don't know I'll say it or show it to him, but, you know, like a lot of parents, you know, he was very proud of me, but, you know, sometimes he would have friction and he'd be like,
Starting point is 00:49:26 and I just want to show him that in the very imperfect way I am now, you know, I've sort of made something of myself, right? And a lot of it thanks to him and a lot of the bets he took. and, you know, my hope is that, you know, somehow, somewhere he knows that, but, you know, that's the one thing I always feel like I never got a chance. You know, you want to, you know, you want to buy your parents cool stuff and you make some money, right? Like, you know, I think people ask me, you know, what is it when you ever make some money, right? And I think one of the best things to, you know, is to go buy your parents something ridiculous that they will never buy for themselves, right? And I got to do that with my mom, by the way.
Starting point is 00:50:09 And I've never got to do with my dad. So I always feel like, you know, but I think the deeper notion would be to be like, hey, it kind of worked out, you know, I made something of myself and thank you. What about you? Like, you know, how do you feel about your relations with your parents and how it is it all over time? I think, you know, I think me and you share a very similar sentiment. And I just always wanted to prove to my parents that I could be something and do good for the world. And it's something that I took on at the age of 18 when I joined the SEAL teams.
Starting point is 00:50:42 I mean, that's really what pulled me through. You know, yes, I wanted to go to war. I wanted to fight for the country in the highest capacity possible and be the best that I could be because I was a failure up to then. And I didn't make good grades. It wasn't very athletic. I didn't really have much going for me. And so, you know, what really pulled me through all that training and got me in was I had a horrible fear of telling my dad that I failed again.
Starting point is 00:51:15 Yes. And so that stuck with me at 18 in Buds. That's what got me through. That's what got me into the agency, you know, was I wanted to, I just always wanted my parents to know, you know, they did a good job. Yes. that I could make something in myself, regardless of, you know, how my childhood went. And my parents are still alive, and we have a very, very close relationship. How did your patterns react when you first signed up?
Starting point is 00:51:45 Time is a funny thing. It's so limited, and yet we let it pass by us. We're all guilty of putting important things off until later. If you've been putting off life insurance, check out Fabric by Gerber Life. They make it fast and easy so you can check it. it off your list right now. Fabric by Gerber Life is term life insurance you can get done today, made for busy parents like you, all online and on your schedule right from your couch. You could be covered in under 10 minutes with no health exam required. Fabric is partnered with Gerber Life,
Starting point is 00:52:20 trusted by millions of families just like yours for over 50 years. There's no risk, there's a 30-day money-back guarantee, and you can cancel in any time. And, They have over 1,900 five-star reviews on Trust Pilot with a rating of excellent. Join the thousands of parents who trust Fabric to help protect their family. Apply today in just minutes at meetfabric.com slash Sean. That's meetfabric.com slash Sean. M-E-E-T Fabric.com slash Sean. Policies issued by Western Southern Life Assurance Company not available in certain states,
Starting point is 00:52:58 prices subject to underwriting and health questions. well what is that conversation like that conversation went uh when it came out like i said it didn't make good grades and i had a lot of potential i was a smart kid i was really good at math and um but i didn't apply myself because i wasn't interested and it came out one day he was really frustrated. I just got a report card or midterms or something, and it was not good. And I know that feeling. C was probably the best grade. And he said, it was an argument, and he said, I'm not going to pay for your college. I'm not going to do it. He's like, you're not going to apply for yourself. And I said, I don't give a shit if you pay for my college or not, because
Starting point is 00:53:51 I'm not going to college. I'm going to become a Navy SEAL. And at that, moment, the whole, everything changed. And he, you know, he, I mean, I think he thought it was full of shit because I didn't follow through on anything that I'd said that I was going to do. And he had asked me, you know, how serious are you about it? And I said, yeah, I've had multiple meetings with the recruiters already. I've already talked about it. I want to sign. I was only 16 or 17 at the time. And why did you want to be a seal? I don't know. You know, I just always grew up playing G.I. Joe's watching G.I. Joe.
Starting point is 00:54:32 I mean, your thing was professional wrestling. Yeah, actually, G.I.J. Equally, I'm trying to get my kids into it right now. We have a, do you have a, sorry, do you have, what is your favorite G.I. Joe toy growing up? Ooh, snake eyes. Perfect. Man, badass ninja. Yeah. Snake eyes was a storm shadow, right? Yeah.
Starting point is 00:54:48 And the rich kids would have the Skystriker jet plane. Oh, man. Yeah. Sorry, yes. But that's what got me, I think. So early on in my younger years, fourth grade, my dad took a job as a pharmacist in the Army. And we went to Germany, and that's when Desert Storm was kicking off. So anytime we were at the bookstore, I was always grabbing all the magazines and books about what was going on over there. And I would just look at the pictures of tanks and helicopters and planes and military war fighters. And I was always out in the woods building forts, carving spears,
Starting point is 00:55:30 making bows and arrows and shit like that. And so that just always got me really interested. And originally, in my military career, I just didn't really care, I just wanted to go fight in a war somewhere. So I wanted to be a Marine. Then I started looking into that, found Force Reconnaissance, talked to the Marine Corps at the recruiting station. They laughed me out of the office because I was about a hundred pounds soaking wet. Went to the Army, wanted to be a Ranger or a Green Beret, same deal, laughed me out of the office. And the Navy recruiter stuck his head out and said, hey, you want to be a SEAL? And I didn't even know what a SEAL was, because I was also very into all the Vietnam-generation movies like Apocalypse Now, Platoon,
Starting point is 00:56:19 and all that kind of stuff. And so they gave me a pamphlet on being a seal. And I read it and went to the library, checked out a bunch of books, started watching movies. And I was like, that's it. That's what I want to do. And luckily wound up making it through and followed through. And that took me into the CIA and had an awesome career there.
Starting point is 00:56:43 And then somehow I wound up podcasting. Well, thank you for everything you've done. But I'm so fascinated with just how you signed up, because I'm very curious about why people pick the careers they do, especially for something like that. Because when you're... How old were you when you signed up? 17. Right.
Starting point is 00:57:03 So, super young. You don't really have a sense of the world. But maybe you did because you were in... You kind of been around the environment a little bit. Like, I'm curious about, like, what is it that makes someone, like, all right, I'm going to do this. I'm going to serve my country. I'm going to put myself in harm's way, right?
Starting point is 00:57:20 What do you think some of the general motivations are when people go like, I'm going to go sign? I mean, I think it's always different, especially in the SEAL teams. I mean, I was one of the maybe, no, I wasn't the youngest. I was maybe the second youngest guy in my Bud's class. And, you know, when you go to Buds, you have all these, you have guys that are coming in from other special operations units that had already been to war. and they're just they're trying to to operate at the highest level as they possibly can you know and what they think they want to do and so you've got 18-year-olds up to 30-year-olds
Starting point is 00:57:59 you know, that are trying out to become a SEAL. And, you know, for me, I can't say that I was overly patriotic. 9/11 didn't happen until I was already in; it happened right after boot camp, so I was already in. But even when 9/11 happened, I mean, I knew we were going to war, but I didn't understand what that meant. So like I kind of said at the beginning, for me, that's just what I wanted to do. It wasn't necessarily just for the country. It was what I wanted to do. I just wanted to experience that, you know, at the highest level.
Starting point is 00:58:43 And then on top of that, even more than my own desires, I just wanted to make my parents and particularly my dad proud because my brother and sister were good students, good at athletics, better than everything that I was doing. And I could see that. I could see the interest, especially in my brother. My brother was a really good ball player, baseball. and I could see my dad gravitating towards, you know, what my brother and sister were doing and kind of like, well, Sean's just the fuck up of the family. And I knew I needed to, you know, from my own mental health, change that to, to, I just wanted my dad to be proud of me.
Starting point is 00:59:29 Was there a moment through your career, serving or elsewhere, where maybe you were like, man, this is the moment I know my parents are proud of me? Yeah, after Hell Week. Do you know what Hell Week is? Okay, yeah. But thanks a lot, by the way, to your show and hearing people talk about it on your show too. Oh, cool, cool. But yeah, that was the moment, you know.
Starting point is 00:59:53 That was the first moment that I felt like, man, even though I'm only four weeks into over a two-year process, that's the biggest hurdle, you know. And so my dad was my first phone call. And I said, Dad, you know, I made it through. It's going to get easier from here. It didn't get easier, but that was the big hurdle, and it felt really good. And then graduating Buds, getting into the SEAL team, first combat deployment, finally, you know, getting into contracting at CIA. I mean, it's always in the back of my head, even still today, you know, with the interviews that I do.
Starting point is 01:00:39 I take this very seriously, and I've made a lot of mistakes in the podcast world, but in general, I have an ability to get stories out, you know, and make people comfortable during these interviews. And, you know, it feels really good to be able to take somebody, like I was telling you at breakfast, to take somebody that nobody's ever heard of, who's starting a business, who's been through struggles, and get them in here and get them vulnerable. Especially with some of the special operations guys, there's a cost that comes with doing that job, both mentally and physically. And there's a culture within that. And, you know, the culture within the SEAL teams and the special operations and even just the military community in general is not one that's widely accepted by civilians. I mean, there's a lot of drinking, debauchery, womanizing, fighting, bar fights. And now I can't even remember where I'm going with this, but to get somebody vulnerable, to talk about what that's like, and suicide attempts
Starting point is 01:02:02 and stuff after. And a lot of these guys, you know, they get out, they don't have a voice, they aren't able to document what happened over there. And to showcase all of that, you know, what their career was like, what it was like getting out, the suicide attempts, the drug addictions, the womanizing, the infidelity, and to have somebody come on here
Starting point is 01:02:27 and talk about all that and how they got out of it, you know, because a lot of veterans feel trapped. They don't fit in with regular society, you know, and regular society is fascinated by the lives of what me and my, you know, former colleagues used to do. And so to open that up, it's like giving the American, not even just the American, the world (I mean, it's a huge podcast now, people all over the world listen) the ability to get a peek behind that curtain on what that life actually is like, from somebody that's lived it.
Starting point is 01:03:07 is awesome. And then the audience gets attracted to it, they're invested. I mean, we go from childhood all the way to current date, and to give somebody a glimpse into what that journey looks like, before this podcast it was unheard of, you know, you didn't get that deep. And so the audience gets invested in a story like that. These guys, you know, I think a really good way out of that downward spiral after military service and after war fighting is entrepreneurship. It's a great segue out, because we talked about purpose at breakfast today, right? People need a purpose, whether they were a factory worker or, you know, an auto mechanic or whatever. And we were talking about AI taking jobs. I mean,
Starting point is 01:04:03 for military guys, for anybody, I think entrepreneurship gives you an enormous amount of purpose. And guys and women coming out of the military, especially in the special ops community, people that want to be the best at what they do, they're going to implement that in whatever they do. But they don't necessarily get the traction, the exposure that they need to create a successful business journey as an entrepreneur.
Starting point is 01:04:55 And time and time again, we've had guys in that didn't have the exposure, but I could see the drive, you know, and I could just see them on the hamster wheel, not getting any traction. So I'm able to bring them in, tell a life story, get the audience invested in their story and who they are as a person. Some of these stories are so wild and real that people don't care what your business is. They just want to see you as a human succeed, for all the sacrifices that you've made for the country. And it brings other veterans hope. And it really brings anybody hope. It's like, man, if this guy's dealing with this shit, then maybe I can push through what I'm going through.
Starting point is 01:05:26 And we're able to turn a lot of startup businesses into multi-million dollar businesses really overnight. And that's what I love to do more than anything or take somebody like you. I mean, you have a huge name in the tech world. You're very connected. You're very successful. But I don't, I think there's still a lot of people that don't know who you are, who you are as a person, all your accomplishments, where you came from in India, in a middle class,
Starting point is 01:05:58 in a middle class family in India. I think a middle class family in India looks a lot different than a middle class family here in the United States. You know, and so to be able to bring somebody on like you, who, I wouldn't say came from nothing, but very little, and to be able to come to the United States and build what you've built, the life for you and your wife and your kids, I mean, that is going to bring somebody, at least one person that's watching this, hope. Because we live in this society where it's become very popular to victimize yourself and make excuses on why you're not finding success, and it's always somebody else's fucking fault, right?
Starting point is 01:06:50 I'm a huge believer that you control your own destiny. And when you find what your gifts are, what you're good at, what you're interested in, I don't believe there are limits. You know, I think, yes, there are limits. Like, I'm not going to be a professional wrestler. You know, I'm a... We'll get you work like that. Don't sell yourself short, sorry. But, you know, I'm a buck 80.
Starting point is 01:07:13 It's just not going to work. But I find the things that I'm interested in, that I'm good at, and I don't think there are limitations. I think the sky is the limit. You can probably go higher than that. And you are a perfect example of that. Thank you. That if you have the drive and the work ethic, and you don't spend your time looking for excuses on why you're not successful, you will rise to the top.
Starting point is 01:07:40 And in stories like yours or Palmer's or the veterans' or Lonsdale's, or a lot of the people that I've brought on, a lot of these guys, you know, they didn't come from generational wealth. They built something out of damn near nothing. And that's America right there. That's it. That's it. And that's why I do it. And so, long way to circle back, I just want to say, well, thank you.
Starting point is 01:08:10 But, you know, I don't know how much people can see of just the room we are in. But before we started, you kind of gave me a tour. And it's just a powerful space, because every object here kind of has some meaning. And so many of them are from people with just these insane, amazing stories, often people who have sacrificed so much for the country, and this is so powerful. And I think one of the things that you've just done
Starting point is 01:08:39 so amazingly well, and I'm not just trying to blow hot air up your behind, is create a space where people can share things. I think we were talking at breakfast about your episode with one of your closest friends. And one of the things I loved about that was just seeing him, seeing somebody who's, you know, such a great sort of masculine figure,
Starting point is 01:09:02 but at the same time, you know, opening up about some of his struggles, dealing with the family, dealing with sort of all the things that he had to overcome. And also in the business world, I was very interested when he talked about how he was navigating the business world, right? And, you know, I think you've just created this amazing space and platform for people to be a part of. So, well, thank you for that. I think what you've just done is just amazing.
Starting point is 01:09:33 Well, thank you for saying that. But let's get back to your story. So you go to Seattle and you start working for Microsoft. What did you think of the United States? Oh, man, dream come true. Like, you grow up in a lot of places around the world, and, you know, you watch this in the movies. You know, I grew up on G.I. Joe, by the way. Like, you know, I have a whole big thing,
Starting point is 01:09:55 a whole big collection. I grew up on Top Gun, Independence Day, Rambo, Rocky, all the classics. So, you know, you're watching it on TV, you have an idea. I think the phrase American dream gets thrown around, and thank you for that, but I didn't know any of that. I just was like, this is a place where I can do something with my skill set. And one of the ways I can maybe make this useful is, a lot of times people ask me, what is my advice for young people, people in their early 20s. And I remember when I first got to Seattle and Redmond, I was just so lost. I didn't have a driver's license. I had never seen snow before, and it was snowing. And my now wife and I,
Starting point is 01:10:43 we were like, oh my God, we had to walk a mile to the grocery store in the snow. And we didn't have SSNs for a while, because we had to figure out paperwork, and we just felt so lost. We didn't know a lot of people there. But when I walked inside a Microsoft building and I would see these computers with coding windows, I felt incredibly comfortable, because I knew that was something I was very, very good at.
Starting point is 01:11:14 And I knew that, on that, if there was anybody in the building, I could go toe to toe with them. And I think that sense of mastery is very important, right? Maybe that was my arrogance, maybe I wasn't that good. But if you're kind of lost, you know, one of the things I tell people is find a way to become a master of one tiny niche thing. So for example, when I was there, we had this big presentation we had to do for Bill Gates and the other senior execs who were kind of running the show at the time: Bill Gates, Steve Ballmer, and this guy, Ray Ozzie. And one of the things I did
Starting point is 01:11:59 was, I became the guy who, for the thing I was working on, was the voice of the customer. What I would do is I would go look at every single internet complaint, and I would call up customers. I'd be like, hey, this is someone from Microsoft. They'd be like, why are you calling me? And I'd be like, hey, I just want to know what you guys think. And I was not supposed to do that. I was on the engineering side. And the reason I tell the story is because over a few months, because I had the technical background,
Starting point is 01:12:33 and I was building some of these products, but I was also talking to customers on this very niche thing, I became like the worldwide foremost expert on that thing, right? How people were using it, what were they doing, what were the issues they were running into. And by the way, it's a tiny thing. I was a master because nobody else cared enough to make themselves a master.
Starting point is 01:12:56 But the reason it was great for me was when I would walk into a meeting, there would be people who were much older. Okay, now they'd be younger than me. But when you're in your early 20s, you know, and somebody is like 40 years old, you're like, oh my God, that guy's ancient. Little did I know. So, you know, now I'm older. But I had an accent, right? I look different, I'm weirdly tall, as you know.
Starting point is 01:13:21 But when it came to this topic, which is about this kind of programming we could do, that was like, I'm the expert, right? And even if the others didn't know it, I knew it. Because I'd spent every day, every night for months and months, and nobody could take that away from me. So when you have these huge meetings, I once had to make this presentation to Steve Ballmer and these execs. you know, I was comfortable because I had done all the work
Starting point is 01:13:50 and I was an expert, right? And so Microsoft was very intimidating because it had all these legendary people, some of these real icons, but when I built this sort of sense of mastery, I found my way to comfort. So I always talk about it when I talk to young people, but even to people trying to break into the technology industry,
Starting point is 01:14:10 which is, you know, the technology industry can seem bizarre. It has all these crazy, larger-than-life personalities, like Elon, Palmer, etc., but if you can make yourself, and everybody can make themselves, the absolute expert in one area, you're going to feel so good, right, and it's going to start opening doors. And so that was a big part. The second thing was, I think a lot of people took a bet on me there, and I'm very grateful. Like, so one of the things I absolutely underline is I am the result of...
Starting point is 01:14:48 of so many things going my way and so many people taking a chance on me when they did not need to. And let me name a couple of people. And so this is gonna get very nerdy and technical, so sorry about that. But one of the things Microsoft had a lot of were these amazing technical geniuses who were really good at one thing.
Starting point is 01:15:13 And one of the most iconic people was a guy named Dave Cutler. And Dave Cutler, you know, is probably known as the greatest programmer of the 20th century, the greatest one or two, okay? Actually, Palmer will tell you how much he admires Dave Cutler. And the reason he's known as the greatest programmer of the 20th century is he basically built the version of Windows in the 90s that became hugely popular. And he's a personality, okay? When I showed up he was in his late 60s; he's now in his 80s. He was known for being incredibly rude and mean to people. There are legendary stories of him punching holes in walls, of throwing people out of his office. He was not a person who suffered fools gladly. But this guy had an insane work ethic. He had more money than God, but he would show up to work every single day, at what must have been his late 60s then, now he's in his 80s, and write code.
Starting point is 01:16:17 Okay? So I remember, you know, I was like, man, I used to idol-worship this guy, right? And computer science people, they still idol-worship him. I need to find a way to impress this guy. And I don't know how. I was terrified, right? And so what I would do is, you know, he would show up on weekends. I remember him once showing up on December 26th and working, right, which not a lot of people do. And so I would start showing up on weekends and just see him in the hallways, et cetera. And after a while, it's like, this young punk is here. And then after like six months, I found an error in his code.
Starting point is 01:16:37 And so I would start showing up on weekends and just see him in the hallways, etc. And then after a while, this young punk is here. And then after like six months, I found an error in his code. This scared the shit out of me. Because he was, he's basically like royalty when it comes to programming. Like, you know, I think, you know, I don't know, you're going up against Michael Jordan and you're saying, hey, by the way, you know, your jumper shooting form, I found an error, right? Or Steph Curry, you know, that three-pointer, like, I saw something last night.
Starting point is 01:17:13 Like, I think you can fix it. So this is on that level. So I was terrified. But I sort of went into his office and I was like, sir, I have this thing. And he goes, and he looks at his computer. I'm like, man, this is the end of my career. And he was known for firing people too, by the way.
Starting point is 01:17:31 I mean, the end of my career right there. And he was like, you know what? You're right. And then I was like, okay. Then I ran out of the office. But, you know, the thing about him was he was so intellectually honest. So he took my fix. He made this bug fix in his code.
Starting point is 01:17:48 And then, you know, he and others took me under their wing a little bit. There were others, a guy named Barry Bond, who was a deep mentor for me and my wife. He would have us over for lunch every day just to kind of teach us things. And so a bet was taken on me. And Dave was royalty, right? It's like Michael Jordan seeing some punk kid and being like, I'm going to take you under my wing a little bit. So he took a chance on me. And then I had that stamp of endorsement.
Starting point is 01:18:18 Other people are like, well, you know, if this guy survived Cutler, this crazy old person, at least he can survive, and he's good. So it helped me. So a lot of people took these crazy bets on me at Microsoft. So when I think about Microsoft, you know, look, Microsoft's a crazy, complicated company, and a lot of people have mixed emotions about it. But from that time, I think about all the people who took a bet on me, even though they had no reason to. And so now, for my wife and me, we are in the position to take bets on people. And I always think about, okay, how am I trying to spot the 20-, 25-year-old version of me? What would they look like? Where would they come
Starting point is 01:19:01 from? Maybe they don't know anything about computers. Maybe they don't know anything about AI. You know, let me find a way to just take a chance. Because I do think one of the most powerful things you can give a young person is this idea that they are capable of much more than what they realize. It's one of the most powerful things you can do. And I have been fortunate to have a couple of people do that for me, where I'd be like, wait, I didn't know I could do that, even something as simple as, I can run this meeting, right? Or I can start this project, or I can start a podcast.
Starting point is 01:19:38 And, you know, that was so powerful, because you need that belief, right? And I think about now, how do I find ways to give that, especially to young people: you are capable of so much more than what you realize right now. It may not be easy. It may take a lot of hard work. You may not accomplish it, whatever, but you are capable. And I think that is one of the most powerful gifts that I've been given by my parents and others that I've worked with, and whenever possible, I try and find a way: can I capture some of that and give it to somebody else?
Starting point is 01:20:18 You empower people. Well, thank you. People empowered me. They gave me belief in myself when I didn't have belief in myself. Maybe they gave me too much belief in myself; some people would say that too. But it's a gift. And, you know, I try and pay it forward whenever I can.
Starting point is 01:20:33 I love that. I love that. I think, you know, I wish more people that find success did that. And I think a lot of people do. A lot of people that I talk to in here do that. People have done that for me, you know. But...
Starting point is 01:20:49 You've had so many of these great stories. I just want to say, I don't want to put you on the spot, but when we were talking at breakfast, you know, you've given so many of these people this platform. And, you know, I don't want to reveal this, but you were like, hey, you are capable of doing this, and, you know, you're capable of so much more. So I think you have done justice with this podcast, too. Thank you. Thank you. I mean, that's how we make the country great. Yes.
Starting point is 01:21:13 You know, you pass it on, you pay it forward, you empower people, you give them confidence, and just shower them with positivity. And, you know, then it's up to them if they go on to do great things. or continue on the same route that they were. I think that's, in some ways, I think that it's one of the most beautiful things about America, which is this idea that you have a shot. Right. Like, you know, if you, and I'm not going to pretend it is easy for everybody, right? You know, a lot of people, you know, just come from different walks of life,
Starting point is 01:21:50 have had bad shit happen to them. But I think, at the core of it, one of the promises of this country is that, you know, if you work hard and if you apply yourself and you do all the right things, good things will happen, you have a shot. A lot of other places in the world don't give you that, by the way. You know, you're kind of told, you have to stick in this lane
Starting point is 01:22:10 or you don't have any opportunity. I do think, you know, at the core of this place is this sense of opportunity. Yeah, right? And, by the way, in some ways we can get to this later, but I think about, with AI, how do we as a country expand the opportunity set, give people way more opportunity than they had before.
Starting point is 01:22:28 So that's one of the things I think about when I think about my current job. Yeah. Where did you go from Microsoft? Did you go to META or was it... So my wife and I, we idolized Silicon Valley, the culture of Silicon Valley. You know, we, when we grew up,
Starting point is 01:22:46 you know, our first date was, I couldn't afford anything. I had this tiny, you know, slightly smelly, one bedroom. And we had to see each other, and the parents didn't know we were seeing each other, so she would sneak in. And we would watch movies about Silicon Valley, right?
Starting point is 01:23:04 One of the greatest movies about Silicon Valley, by the way, is this movie called Pirates of Silicon Valley. It's a movie about Steve Jobs versus Bill Gates in their young heyday and how they competed with each other. And we would be like, man, someday we want to be there. And, by the way, that was, I think, a BitTorrented copy. Like, we didn't exactly pay for that.
Starting point is 01:23:26 Like, we were like, we couldn't afford the actual real thing. Hopefully I don't get in trouble for that, like, 24 years later. But, you know, we were like, someday we want to be there, because Silicon Valley was the land of opportunity. You went there, you were good at computers, you could make something of yourself, right? So for me, like, you know, for example, if you're playing football, what do you dream?
Starting point is 01:23:49 You want to play in the Super Bowl, right? You're a pro wrestler, you want to main-event WrestleMania, hopefully at Madison Square Garden. But for me, well, I also want to main-event WrestleMania, but I wanted to make something of myself in Silicon Valley. So anyway, in 2011, 2012, we decided, okay, Microsoft has been great for us, and we did great. My wife and I had fantastic careers there. We made a lot of friends, you know, we did very well for ourselves.
Starting point is 01:24:15 We were like, we want to take a risk and go down to Silicon Valley. We had just gotten married too, by the way. And the way we did it is we said, okay, there are two of us. So, you know, we're going to sort of hedge risk between us: one of us is going to take a chance and build a startup, and one of us is going to go get, like, a regular job and paycheck. So one of us is paying the bills, and the other person can kind of follow this entrepreneurial path, and we can kind of take the pressure off each other. And the deal we made with each other was that we would kind of alternate. Like, somebody will go start a company first, somebody will go get, like, a regular job, and then maybe we'll switch it around. Like, we didn't know what we were doing.
Starting point is 01:24:54 We were like, we just wanted to be part of Silicon Valley. I will say Silicon Valley is a magical place. And I think it is one of the unique advantages America has over a lot of other parts of the world: the combination of capital which flows in, the combination of talent there, the density of the companies there, the idea that anything is possible.
Starting point is 01:25:17 It's not perfect, there are a lot of issues with it, which we can talk about, but it is magical and it is unique. So we wanted to be there. So we quit our jobs at Microsoft, and we flew down to the San Francisco Bay Area. I'm like, all right, what do we do? And so a year later, my wife, do you know what Y Combinator is? Yes.
Starting point is 01:25:40 Okay, so Y Combinator is this very popular startup incubator. It's maybe one of the most popular startup incubators. At the time, they were maybe not as popular as they are now, but they were still pretty popular. And what they would do is, if you had an idea, you go to them, you apply, and they pick maybe 20, 30 companies a batch, and some amazing companies have come out of that. Airbnb, Coinbase, Brian Armstrong,
Starting point is 01:26:06 I think talked about Y Combinator here, Dropbox, lots of great companies that come from there. And so my wife had this idea, she applied, she got into YC. And YC, by the way, doesn't give you a ton of money. I think at the time, they gave you for the entire company, maybe like $80,000 or something. So it's not a lot.
Starting point is 01:26:25 Like, you know, and I was like, okay, I need to figure out how to just get a paycheck, get some regular money. And so I wound up joining what was then Facebook, now called Meta. And some of the older people listening will remember this: Facebook then had just gone public. And they were in a dark hole because, number one, the IPO had gone terribly poorly.
Starting point is 01:26:53 They had gone public, I think, at like $45, and the stock price had plummeted. Second is, there was this big question about the whole world moving from desktop computers to mobile, and Facebook was only making money on the desktop. So there was this big question of, first of all, the idea of social networks making money seemed laughable. People just laughed, because people had seen MySpace. They had seen all these other companies fail. They were like, I don't even know why this is a thing.
Starting point is 01:27:23 Second was, they were like, nobody can make money on mobile. So they were in a bit of a hole. And I had some friends there, and I told myself, I want to do something which is very different from Microsoft. I want to do something which involves consumers, because I really was interested in consumer psychology. You know, what makes human beings use products, how they interact with products,
Starting point is 01:27:45 how to build the technical algorithms, now maybe you'd call it AI, we didn't really have that word then, which kind of interact with them. I was very interested, and I just wanted to do something different. So I got a job at Facebook, and I wound up working on Facebook advertising. Now it's a monster, you know, a multi-billion-dollar business.
Starting point is 01:28:03 At the time, it was like, the stock was down. And again, I got very lucky. You know, I wound up building this ad ecosystem product called the Facebook ad network, with some amazing, smart people. And we went from $0 to a billion-dollar business in three months. In three months? Yes. I remember, like, you know, I think $2.7 million a day, a billion-dollar run rate, and it was a rocket ship.
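A quick back-of-the-envelope check of that run-rate arithmetic, as a sketch of my own rather than anything quoted in the episode: a "run rate" simply extrapolates the daily figure to a full year, so roughly $2.7 million a day works out to about a billion dollars annually.

```python
# Back-of-the-envelope check of the run-rate figure above (my own sketch,
# not a number from the episode): "run rate" annualizes daily revenue.
daily_revenue = 2.7e6                  # ~$2.7 million per day, as quoted
annual_run_rate = daily_revenue * 365  # simple annualization
print(f"${annual_run_rate:,.0f}")      # -> $985,500,000, i.e. roughly a $1B run rate
```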
Starting point is 01:28:33 And I think, look, again, when you look back now, I think we were lucky for two things. One, people were starting to buy things on their phone, right? Because the iPhone had come out in 2008. I think the app store came out maybe a few years later, right? And one of my mentors tells me, like, you know, there is no difference between being wrong and being early. They're just the same. But in 2012, 2013, people are starting to buy things.
Starting point is 01:29:03 So there was commerce happening. And when there was commerce happening, people wanted to advertise and push their products: Target, Amazon, mobile games. Second, Facebook at the time, you know, had actual authentic people, not like bots, which were a huge problem on the internet. And third was we were able to marry that
Starting point is 01:29:22 with these algorithms and products. And the big lesson I learned at Facebook is the power of working with great people. Because I had this small team, this guy named VJ and others, who were just fantastic engineers. These are the kind of people who would go off on a weekend and rewrite an entire system
Starting point is 01:29:48 which had taken a team a year, and they'd do it over a weekend. Palmer Luckey and I shared a common sort of hero figure, a guy named John Carmack. Did you talk about John Carmack when he came on the show? I believe he did. He did, right? So John Carmack, you know,
Starting point is 01:30:04 I think, by the way, is one of the idols of the 20th century. John Carmack was the guy who built Doom, the video game Doom, right? Oh, man, I used to love that game. Yes, yes. So Doom, and so Carmack is a programming god, right?
Starting point is 01:30:25 What he did is he basically invented what is called the game engine, which ran under Doom and then under Quake, all these engines. And then eventually he met Palmer, because he was very interested in VR, and they wound up doing Oculus together. And then, you know, Carmack got hired into Facebook, right? But the reason I bring up Carmack is that there's this great story at Facebook where they realized, like, Carmack individually
Starting point is 01:30:48 was doing the work of entire 200-person teams. Holy shit. Just one person, guzzling Diet Coke, right? Like, I think he now lives in Dallas, right? Sitting by himself, just doing the work of 200 people. At one time, you know, Facebook HR realized they had nowhere else to promote him to because they had just run out of levels to give him. And he was just a machine.
Starting point is 01:31:15 And so the reason I bring this up is it is very hard to get yourself to the top of an industry unless you know what greatness looks like. So I was lucky, because when I joined Facebook, I was surrounded, just by sheer serendipity, by some great engineers, just like I was at Microsoft, right? Now, just to be clear, these guys were a thousand times better than me. They would run circles around me with anything technical without even batting an eyelid. It's like me playing a pickup game against LeBron or something, right? But I saw what greatness looks like, which meant that many, many years later, when I started doing investing, or when I meet entrepreneurs now, I have a rubric in my mind of, okay, I know what elite talent looks like. I see the work they put in. I see how they talk, how they think, how they spend their free time.
Starting point is 01:32:16 So now, you know, it's like if you know how Steph Curry shoots threes, and then you know how the kid you play with at the gym shoots threes. You know, I kind of see what greatness looks like. So I was exposed to some real greatness at Facebook. So I wound up doing ads, and it became a huge hit at Facebook. And I think Silicon Valley is one of those places where it has a lot of good things, but everyone's kind of looking for, you know, what is the win you've had, right?
Starting point is 01:32:47 What is the thing which kind of says, okay, you are somebody who's capable of something? And Facebook, that whole thing, gave me that. Like, I started to just get known as, oh, you know, Sriram's this guy who kind of built all this stuff at Facebook. And if you Google me from back then, I would start showing up in all of these press pieces. It's very important. I think it put me and my wife on the map in
Starting point is 01:33:08 Silicon Valley. And so I'm very, very grateful for that. Wow. Wow. You had mentioned something about five minutes ago, I think you said there's no difference between being early and being wrong. What do you mean by that? So, I stole this from somebody we should talk about: Marc Andreessen. Marc Andreessen is the inventor of the web browser. He was the founder of Netscape. Did you use Netscape? Oh, yeah. Yeah, great, right?
Starting point is 01:33:41 So the spinning logo with the stars and all of that. So Marc Andreessen was kind of one of the original boy wonders of the technology world. He built this browser called Mosaic, and then he started this company called Netscape, which was kind of this darling child of Silicon Valley in the 90s, and then got crushed by Microsoft. But then, 10 years later, he and Ben Horowitz started this venture capital firm, Andreessen Horowitz, which we can get to, and why I wound up joining later. But for many years he was a mentor, and is still a mentor, for me. And he has a great many sayings.
Starting point is 01:34:17 And so when you have an idea as an entrepreneur, right, I think there are a lot of things which can go wrong. And one of the things that you have to think about is: is this the right time for this idea? And history is filled with examples of companies that had the right idea but were too early and died. Let me give an example. Do you know what the company Instacart does? It's a grocery delivery service, right? Or DoorDash.
Starting point is 01:34:52 What does DoorDash do? Right? Like, you buy a product, they bring it in from the nearby restaurant, or, you know, deliver your groceries, right? Exact same company as Webvan in the 90s, right? One of the most famous dot-com busts. If people ask, oh man, what was one of the biggest things in the dot-com era which lost money, it would be Webvan. They were right.
Starting point is 01:35:13 They were just too early, right? Or Pets.com, right? Another famous dot-com bust, but people figured it out later. MySpace, right? Like, MySpace or Friendster. Do you remember Friendster? I don't remember Friendster. All these social networks, but MySpace, I'm sure you were on, right?
Starting point is 01:35:29 And, you know, they sold to, I think, Murdoch, but, you know, Facebook did so much better. And I think as an entrepreneur, you have to think about, is this the right inflection time for my idea? Because when you connect the right idea, the right entrepreneur, and the right time in history, magic happens. I'll give an example. You know, if you look at YouTube, right? People think of YouTube as, obviously, these days the de facto way to have videos
Starting point is 01:36:01 online. But when they first came out, other people had tried it. Like, you know, Google had this effort called Google Video. They tried to get videos online. Others had tried it, but none of them had really worked. But YouTube captured this moment in time, because digital cameras were starting to explode, and I think you started to see the rise of early mobile phones which had reasonable camera quality, right? And in 2008, the iPhone comes out. So one is, you're starting to see more video production by regular people, right? Without needing a camcorder or doing these home videos
Starting point is 01:36:41 or needing a bulky camera, you're seeing video production go up. Second is, broadband around the United States was getting better. Super important to basically get these videos to show up. Like, I don't know if you remember the 90s: you'd open up RealPlayer, and it's buffering, buffering, buffering. You watch a minute, buffering, buffering, right?
Starting point is 01:37:00 By, you know, 2008, most of those issues were pretty much solved, and you could watch a video without it, you know, buffering. And so these guys, the YouTube founders, Steve and Jawed and all these guys, built this product which, more than anything, just captured the right moment in history, right? And they rode that wave. And I think a lot of times the difference between a $500 billion iconic company and some company which runs out of money, you know, is not the persistence
Starting point is 01:37:36 of the heart or the effort of the entrepreneur, right? It is just that they were at the wrong time, right? And so when you're an investor, which obviously, you know, we can get to that, I spent a lot of time as one, I was often trying to think about: is this the right time for this idea to happen? Palmer is another good example, right? So a lot of people had tried virtual reality. In the 90s, there was this thing called VRML, where you were like,
Starting point is 01:38:03 oh, let us embed virtual reality in your computer. And there were all these movies and TV shows which had virtual reality in them. The Matrix is the most famous one, but there was a Pierce Brosnan movie. There were all these things that had virtual reality. But the challenge was that the hardware was too complicated. Nobody could figure out the hardware. And there was no internet bandwidth to kind of show you these experiences, right? So what Palmer did, through his genius, and he told you his amazing story, figuring it out by himself, is he sort of captured the right moment in time with the camera gear, figuring out all the low-latency interfaces so that the image being projected is reacting with you at near-zero latency.
Starting point is 01:38:53 So that was the right moment in time. So I often think about like you need the right time. The other thing you need is you need some luck and some magical lightning in a bottle, especially with consumer products. I think with businesses and enterprises, it's a bit different. You can go call up your customers and you can be like, hey, you know, what do you guys want? I can build that for you.
Starting point is 01:39:16 So YouTube, you know, they started taking all these videos, they put them up, I think an SNL video once went viral, and they were off to the races. Facebook has an interesting history. Everybody knows about the Facebook history, about Harvard. They've all seen the movie, The Social Network.
Starting point is 01:39:36 But one of the other untold pieces of Facebook history is how videos became a thing on Facebook. Because for many, many years, Facebook only did photos, right? This all seems like ancient history, and anybody under the age of 30 is like, hey, folks, what are you talking about, I'm on TikTok, Instagram. But for a long time, you posted a photo on Facebook of your friends and you tagged them.
Starting point is 01:39:59 No videos until do you remember this thing called the Ice Bucket Challenge? Yes. Right? This is internet history. The Ice Bucket Challenge was basically people raising money
Starting point is 01:40:12 and what you would do is you would take this bucket of cold water dunk it on yourself with a video camera in front of you and then most importantly you would then shout out four or five people. You'd be like, hey, I'm going to, you know, challenge my celebrity friend to go to the
Starting point is 01:40:28 Ice Bucket Challenge, right? And it spread. So I was at Facebook at this time, and, you know, Facebook had just launched video, but they could not really get people to upload video, or that's what was going on. But the Ice Bucket Challenge had this remarkable couple of properties. One was that it was video and people had to create it on their own phone, so it was personal. But second, you needed to tag your friends, because you were challenging some friends.
Starting point is 01:41:02 Facebook just happened to have, you know, a platform where you could post a video and you could tag a friend. You could not do it on YouTube. So I remember being in this meeting with Sheryl Sandberg where there was this vertical straight line, and that was usage, right, because everybody, you know, was uploading videos onto Facebook. And so every platform has one of these stories. With AI, I would say it's ChatGPT, for example, which has a story like that. But I think my point is that with consumer products, you need lightning in a bottle and you need the right timing. And often, when I talk to an entrepreneur, I ask them, what makes you so sure that now is the right time? Not four years ago, not four years from now. Why is today the right time?
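As an aside, a toy model can make it concrete why a challenge in which each participant nominates several friends produces the kind of vertical usage curve described above. The branching factor, participation rate, and counts below are purely illustrative assumptions of mine, not data from Facebook or the episode.

```python
# Toy model (illustrative assumptions only) of challenge-style virality:
# each active participant nominates several friends, and a fraction of
# nominees take part, so participation compounds round after round.
def challenge_growth(branching_factor=4, participation_rate=0.5, rounds=8, seed_participants=10):
    total = active = seed_participants
    for _ in range(rounds):
        # nominees who actually film a video and nominate others in turn
        active = int(active * branching_factor * participation_rate)
        total += active
    return total

print(challenge_growth())  # with these assumed numbers, 10 seeds become 5,110 participants in 8 rounds
```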
Starting point is 01:41:41 Interesting. I'd never thought of it like that. And your explanation makes a hell of a lot of sense. You went to work at Facebook and created the ad network and had a big part in the videos, and it sounds like you had a very big part in making it tremendously successful. Your wife went into entrepreneurship. Yeah. What did she do? So she started this, and I want to tell people my wife is the more impressive person of the pair, because it is true.
Starting point is 01:42:13 She's a multi-time entrepreneur. She's actually been through the journey so many times. So she started this company, which was about renting electronic gear. The idea, which, by the way, I think she would say was maybe a bit ahead of its time, going back to our sort of theme. The idea was that, for example, you're surrounded by a lot of high-end camera gear here, but instead of having to buy it, could you just rent it and try it out? Because you're going on a trip, or you're going on a photo shoot,
Starting point is 01:42:46 or maybe you're renting a drone, and that was a big deal because drones are very expensive. Somebody was doing an ad shoot, or somebody was like, I just want to do this one scene for a day, and they would rent out this gear to them. So she raised money, you know, from Y Combinator, from a bunch of people, and they did very well for a period of time, you know,
Starting point is 01:43:08 I think they were, you know, I'll say a couple of dozen employees, maybe. They had a lot of customers, who loved them. But I think the challenge that they ran into, she would say, is that they were a little bit early, because now you're actually seeing other companies do that at scale, because so many other consumer product categories have exploded which need that dynamic. And at the time, I think she was doing cameras and drones.
Starting point is 01:43:30 So, yeah, she had a reasonable outcome, and I think she had a great experience, but she was also, like, a bit early. But also, at the same time, my wife and I, this is when we started doing some angel investing together, and she's a fantastic investor, that's what she does now
Starting point is 01:43:52 full time. But we had a tiny bit of money, a little bit of money we had made from Facebook, and one of the things about Silicon Valley is that, you know, you just start meeting people who are starting companies all the time,
Starting point is 01:44:07 and we started to learn how to do angel investing. And we would write incredibly small amounts of money compared to what a lot of other angel investors write; sometimes we wrote, like, a couple of thousand dollars, right? And just because we wanted to support someone who was a friend. But I think that, over the next four, five, six years, taught me a lot about investing. It brought me in touch with fantastic entrepreneurs, actually, some of whom have been on your podcast before, and in a way it kind of led me and her to our investing careers much later. Interesting. Yeah. Very interesting. So why did you wind up leaving Facebook?
Starting point is 01:44:52 I was bored. I think there's a pattern through my career where, once I've sort of felt comfortable and settled in, I want to be like, okay, I want to find my next thing which really challenges me and pushes me. So I think, look, they were great. You know, a lot of friends there at the time, some have left, et cetera. I could probably have stayed, you know, I was making decent money, I could have made more money. But I was just like, I just want a different adventure.
Starting point is 01:45:25 I have that problem. Yeah, yeah. Like, I mean, if you look at my, you know, when you were reading my bio, I was like, man, there's a lot of stuff in there, right? Like, you know, there's this podcast and all of that, and part of it, I think, is just that I've sought out different adventures over time. And so I was very comfortable there. And also, as a part of it, I'm very suspicious if I'm very comfortable,
Starting point is 01:45:50 because I'm like, man, am I stagnating? You know, do I need to push myself, right? Like, I'm also very competitive. So I would be like, I'm not pushing myself hard enough, because I'm just showing up in this job, and Facebook at the time had become a big company. They didn't really need me, right? Like, you know, because that's how you build these big companies.
Starting point is 01:46:12 Like, you don't, you shouldn't have to need anyone. Like, that's how these companies are built. So I just, I get very restless when I'm comfortable. One of the good things about my current job, you're never comfortable. But I was just like, I was just bored. I wanted an adventure. And I was kind of, I was advising a few entrepreneurs. I was doing some investing.
Starting point is 01:46:33 but I would say what that path eventually led me to is the first of my two times at Twitter, right? And so Twitter at the time, you know, there was a guy named Jack Dorsey who was running it, who was a founder, and they had been through a lot of really bad turns as a business. And one of the things I like to compare is Facebook and Twitter as companies. Because at one time, in 2008 or 2009, Twitter, by the way, of course now X, run by Elon, which we can come to later,
Starting point is 01:47:07 But at the time, in 2008, 2009, Twitter was seen as the potential $100 billion, $200 billion, you know, company. And Facebook was, oh, just this toy social network that will never spread outside colleges. That obviously flipped. And part of the reason, I think, is that Facebook had a very methodical way of using metrics and data and numbers and experimentation. One of the lessons I really learned from some of the people there was to always be distrustful unless you have the numbers and the experiments to prove it. And if the numbers or the experiments don't back it up, you should be prepared to change your line of thinking. One of the things I think Facebook was very good at is changing what they believe about things when they have new data, at least at the time I was there. Very different company now.
Starting point is 01:47:59 You know, I don't really spend any time with them now, obviously. So I kind of learned that mentality about, like, okay, if I run this experiment, if I try something, if I learn something different, I'm going to change my worldview, right? They were very good at that. Twitter was very different, right? They had this product, 140 characters. It had worked. And then one day it wasn't really working, and they were a public company. They were making enough money.
Starting point is 01:48:22 They had a CEO change. So it was a little bit of a spiral, right? Like you're a public company, you know, people are comparing you to the other companies. you're getting like a new CEO every year or two. So it was a bad place to be. And they were spiraling and spiraling. And at the time, you know, I love Twitter and now X as a product. I use it all the time, right?
Starting point is 01:48:42 Like, you know, it has given me so much. A lot of my personal relationships have come from that. Professional relationships have come from that. I think it's just a fantastic product for the world in terms of, you know, just what it has done. And, you know, somebody reached out to me and say, hey, do you want to help? And I was like, yeah, sure.
Starting point is 01:49:00 And, you know, somebody reached out to me and said, hey, do you want to help? And I was like, yeah, sure. Like, you know, it seems like a challenge. The company was in a bit of trouble. It was not glamorous, right? Like, you know, it was not a sexy company to work at. But I really liked it and it seemed like a challenge. So I wound up joining there. And that was quite the thing, because one of the things I did not realize about Twitter was how insanely political, in many different ways, it was on the inside, right? So that was quite the adventure. But that was, again... So I joined Twitter, I think, in 2017.
Starting point is 01:49:24 2017? Yes. You know, before we go farther, I'm just curious, I mean, what are your thoughts about just social networking in general? I mean, it's caused a lot of problems in the world. I think it's also connected a lot of people in the world. I mean, I've made a lot of contacts off X, Facebook, Instagram, not so much TikTok, but, you know, I feel like it's a double-edged sword. You know, it's also a great way for government, anybody, to kind of map out who you are. It's immediate. I mean, I use it. When I get a message on any one of these social platforms
Starting point is 01:50:16 and it seems like an interesting message, the first thing I do is I go to the friends list, see who their mutual connections are. And so, you know, for a lot of people, myself included, your life goes onto these social networks. It's easy to map you. It's easy to map your connections, everybody you're connected to, who they're connected to. I mean, I'm just curious, you know, what are your thoughts on all of it, given that social media has completely revolutionized the world in a multitude of ways? Sleep is one of the most important parts of my health, both mentally and physically.
Starting point is 01:51:00 Since getting my Helix mattress, I've noticed a huge difference. Before, I never felt fully rested. Now I'm sleeping through the night and waking up refreshed, and it's made a real impact on my life. If you're looking to upgrade the quality of your sleep, now is a great time to try a Helix mattress. Helix is the award-winning mattress brand, and it's recommended by me and many others for improving sleep. Helix is made to fit your body type and your sleep position, and they can recommend which mattress will work best for you. And right now, you can save when you decide to buy your own Helix mattress with this offer for my listeners. Go to HelixSleep.com slash SRS for 25% off
Starting point is 01:51:44 sitewide. That's HelixSleep.com slash SRS for 25% off sitewide. Make sure you enter our show name after checkout so they know we sent you. HelixSleep.com slash SRS. It's such a complicated question, right? You know, first of all, as somebody who's sort of worked
Starting point is 01:52:04 in a few of these things, right, like, I have some biases. Like, when you're a part of an institution, right? I've been part of multiple of these institutions, right? Like, you sort of have an affinity. You want to defend it. That's number one. But the second part is I've also
Starting point is 01:52:19 seen the abuses that happened when social media, you know, has real power, right? I've seen the censorship. I've seen, like, the centralized control that can happen. And I've seen the harm that it can do for kids, for example. You know, when we grew up, I didn't have to worry about, you know, was I a popular kid, would I get bullied if I said something? I have a lot of friends with adolescent daughters, sons,
Starting point is 01:52:53 and they have to worry about that. It is a very complicated topic. I think the thing which I often think about is, one of the things I saw at Twitter was how censorship or nudging political agendas can happen, sometimes deliberately, sometimes even not deliberately, and how they can have huge impact. Right?
Starting point is 01:53:16 So for example, people think of like the Twitter files and, you know, all these examples of censorship happening at Twitter, right? And I was there when some of this happened. And one of the things I always sort of fought back on was like, you know, the influence of politics inside these social media algorithms, okay? So people often know, I think, the easy ones. They are like, oh, my account got shadow banned, right? Or I said something, like, for example, in COVID, you know, I said the virus came from a lab and, you know, my account got blocked.
Starting point is 01:53:51 Which is, by the way, you know, the NIH director, Jay Bhattacharya, he said that, his account got blocked. And think about his story now, you know, he's now, like, running the NIH. But Twitter deleted his account. Like, for basically saying, hey, maybe the virus came from a lab and, you know, maybe I'm not totally sure about the mask thing. And they deleted his account. But what I also saw was like how easily algorithms can be used to shape politics. I'll give you a story. I was at Twitter one day and I wake up and I'm scrolling the internet and I see this story.
Starting point is 01:54:26 And I won't say who, but it is this famous Hollywood actor. And it says that this Hollywood actor has this movie project that was just announced, but the internet is mad at him for getting this movie project and he might lose it. And the reason it caught my eye was, you know, all these stories would embed the same two or three tweets inside. And I worked on that. You know, I ran a lot of the algorithms. And I was like, these tweets look a bit weird. Like, there's no followers.
Starting point is 01:54:58 Like, there's no likes, retweets. How did these get found? Right? Like, why are these kind of getting surfaced up? So I had some free time, and I sent some emails, and I poked around. And it turned out, what had happened is, do you know what trending is? Trending on Twitter? So at the time, the trending algorithm on Twitter
Starting point is 01:55:18 had all sorts of issues, all sorts of bugs, right? And what it would do is sometimes it would try and tell you, hey, this is what people are talking about right now on Twitter, right? It's trending. Sometimes it would just kind of pick a random, arbitrary thing and say it's trending, right? When it couldn't find anything.
Starting point is 01:55:36 And what happened one night, that previous night, was it had found a random set of tweets. And because it could not find a legitimate trending topic, this algorithm, by no one's fault, or someone's fault, but not out of any deliberate agenda, right? It said these tweets by total unknown people about this actor were important, okay? So this happens maybe at like 2 a.m., 3 a.m. And then in the morning, right, like what winds up happening is on Twitter, there was this product, there was a way where they could highlight these tweets and kind of give it big imagery. So one of these people wakes up. And some of these folks, you know, just sometimes had a political agenda. But I think they were just like, hey, I just want to do my job, you know, and this thing is trending.
Starting point is 01:56:19 They did not know this came from an error. So at 5 a.m. or 6 a.m. New York time, they take this and they go say, this is a thing which is happening on the Internet. A couple of hours later, all these editors of all these, you know, media publications wake up, they're like, oh, there's a thing people are talking about on the Internet. Hey, can you chase this story down and write about it? And by the afternoon, you know, I think that whole thing, you know, the guy's agent was getting calls and being like, oh my God, like what is happening, right? The internet is talking about this.
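To make the failure mode he's describing concrete, here is a minimal Python sketch of that kind of trending fallback, assuming a simple list of candidate topics with tweet counts. The function name, fields, and threshold are invented for illustration; this is not Twitter's actual code.

def pick_trending(candidates, min_tweets=5000):
    # Normal path: only promote topics with genuinely high volume.
    legitimate = [c for c in candidates if c["tweet_count"] >= min_tweets]
    if legitimate:
        return max(legitimate, key=lambda c: c["tweet_count"])
    # Fallback path: nothing clears the bar (say, 2 or 3 a.m.), but the slot
    # still has to show something, so a low-volume topic gets promoted anyway.
    return max(candidates, key=lambda c: c["tweet_count"]) if candidates else None

overnight = [
    {"topic": "actor movie project", "tweet_count": 12},  # a handful of unknown accounts
    {"topic": "late-night sports take", "tweet_count": 9},
]
print(pick_trending(overnight))  # promotes a 12-tweet topic as if it were a real trend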
Starting point is 01:56:53 And the reason I bring up the story is, you know, first of all, it's kind of silly. It's like a Hollywood movie. Nobody really cares. But it tells you, right, like how almost easily, inadvertently, you could shape the discourse, right? Shape the arc of something, you know, so easily, right? And the combination of the algorithms, the metrics, and some of the people inside Twitter meant you could really influence the world. Like, I often call Twitter, you know, a memetic battleground. You know what memes and memetics are, right?
Starting point is 01:57:28 So Twitter is a place where all these ideas fight. And if you win, you get to spread all over the world, a little bit. And so one of the things that people wound up doing was finding ways to sort of inject their idea into this memetic battleground and try and fight everyone else. Now, this was rewarded by the algorithm, because what the algorithm was doing was it was looking at, okay, what is getting the most attention, likes, retweets, right? And these were often things which were provocative, like, made people angry, right? And then those people would get followers. Okay. So there was this kind of system which was being created where if you said something provocative and you made people angry, right?
Starting point is 01:58:20 Like, one, you could kind of shape the discourse. Second, the algorithm would be like, oh, this is what people on Twitter want because it's getting more attention. I'm going to send you more followers. I'm going to bump you up in the algorithm, right? And this had two, I think, catastrophic impacts, right? One was that the people who were sort of exhibiting some of the worst behavior, right? Like, were getting rewarded, right?
Starting point is 01:58:46 Like, the more you get people angry, the more, you know, you get people outraged, the more attention you're getting. But the second, more subtle thing which was happening was when somebody new signed up on Twitter, right? Imagine you walk into a restaurant for the very first time. You walk into some fancy-schmancy Michelin restaurant, and you're like, okay, everyone's here in a suit or dressed up, right?
Starting point is 01:59:11 And you're like, I got to look the part. Or let's say you go to a sports bar, right? Like late night, you're watching a game. It's a bit rowdy. You know the vibe. The thing which Twitter was doing was it was shaping the vibe to be one where anger and provocation was rewarded as opposed to kindness,
Starting point is 01:59:31 as opposed to education. I'm not saying it didn't exist. There was a lot of it. Obviously, you and I and others have had great experiences. But it was sort of pushing the ball over in one direction, right? And when new people signed up, they were like, oh, I don't know what this place is. Oh, who's doing really well here?
Starting point is 01:59:46 Oh, it's that guy who's getting people angry. By the way, this happens on YouTube. This happens on platforms all the time. What do people do? They're like, what are the videos that are doing really well? What are they doing on the thumbnail? What do the titles look like? Let me copy that.
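As a rough illustration of the loop he's describing, here is a minimal sketch of an engagement-weighted score in Python. The post fields and weights are made up for the example; they are not Twitter's real ranking features.

posts = [
    {"author": "calm_explainer", "likes": 40, "retweets": 5, "replies": 3},
    {"author": "outrage_poster", "likes": 900, "retweets": 450, "replies": 700},
]

def engagement_score(post):
    # Replies and retweets tend to spike on provocative posts, so weighting raw
    # engagement quietly rewards whatever makes people angriest.
    return post["likes"] + 2 * post["retweets"] + 3 * post["replies"]

for post in sorted(posts, key=engagement_score, reverse=True):
    print(post["author"], engagement_score(post))
# The provocative account ranks first, gets shown to more people, gains
# followers, and the loop repeats.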
Starting point is 02:00:01 So one, you're kind of giving the people who were exhibiting some of the worst behavior more wealth, some literal wealth. Second, you were training everybody new at Twitter. Those are the icons. Those are the people you should follow, right? So there was kind of this whole system which evolved. And so anyway, my point of this sort of roundabout explanation is, one, it taught me the absolute power of social media. Second, it taught me how these systems, when centralized, can have real power and real censorship, and how important it is that you have absolute transparency, number one.
Starting point is 02:00:39 We need to know what algorithms exist in these platforms. We need to know that there is no ideology in these platforms. So it kind of really was an awakening moment for me. It kind of really shaped me, as somebody on the inside, like, oh, my gosh, like, you know, we need to sort of fix these things. And this is true of every platform, by the way. I'm not picking on anybody, everybody, every platform. The second thing it kind of really sent me down the path of was decentralization. This idea, which I think a lot of the crypto people, you know, would really resonate with.
Starting point is 02:01:12 It's like, we as people need to have ownership over how these things work. We need to have a say. We need to know what is happening behind the curtain. And maybe we just don't have like one big thing. We need to have a lot of other small things competing. So it made me very distrustful of top-down centralized control, and it made me feel like, okay, I need to find a way to bring more decentralized systems into the world, which is kind of why I wound up in crypto for a while.
Starting point is 02:01:41 But yeah, so social media, I think, shaped my career. It's given me a lot. But I also saw a lot of sides of it. My Twitter time was deeply formative for me. And when I left, you know, I went into investing, but part of it was like, I need to find ways to battle some of the negative things I saw over there. That makes a lot of sense. You know, that makes a lot of sense.
Starting point is 02:02:03 It's just such a powerful tool. I mean, back to, you know, the 2020 election time frame. I mean, we saw what appeared to me, you know, from an outsider, is, you know, we saw a sitting president get banned on X, get banned on Facebook, get banned on Instagram, get banned on YouTube, banned on all of these networks. And so, you know, the way it appeared to, I think, everybody, is that these top guys, you know, at these companies are controlling the entire narrative, everything that's going out. And, you know, it appeared. And I don't think it just appeared.
Starting point is 02:02:44 It was that way. Everybody leaned one way. And, you know, that was, you know, obviously towards the left. And, and, and, and it, it, I mean, they deple, it was that one platform, they, they just pulled it, a AWS parlor. Parlor, yeah, they just pulled it. I'm not, I'm not sure why I don't know what is going on in there. But, you know, it was like, holy shit. Like, now there's an entire party. Oh, yes. You know, with zero voice to include a sitting president. And that really, I think, opened everybody's eyes immediately of, of how powerful.
Starting point is 02:03:21 and in dangers so that this could be? The deep platforming was huge. I think in a way, like, I had sort of seen the building, I had seen how it was, what was building up. So I was not terribly surprised when all of that happened. But if you think about that era, right, number one was you had the previous administration putting pressure on the social media platforms to take down content, right? Like, they would send these angry emails.
Starting point is 02:03:48 This is now all documented and it's come out, you know, in depositions and lawsuits and hearings and whatnot. They would send all these angry emails to say, hey, you need to take this tweet down, right? Like, we don't love this tweet about Hunter Biden. Like, we don't like this tweet about, you know, mask mandates or whatever the case may be. So they would send these down. The second thing, I think, what was happening was this was the peak of the culture wars. And these platforms, like, there was this domino effect, where if one platform, you know,
Starting point is 02:04:21 took down a video, a piece of content, you know, downranked something, shadow banned something, for example, then every other platform immediately felt pressure. Because what would happen is all of a sudden, you would get an email from the former White House, a member of Congress, or a New York Times hit piece,
Starting point is 02:04:44 which would be like, hey, those guys have taken down this thing, you guys are leaving this up, and you are responsible for all these kinds of harms. And so they tried to enforce the Overton window of conversation, and they tried to really shrink what can be said. And I think, you know, in some ways, I think the platforms, I think the turning moment in my view, and some people may disagree, is when, well, obviously, before this election, would be when Elon bought Twitter.
Starting point is 02:05:27 And by the way, I was there, I was there on the first day. We're kind of jumping ahead a bit, but I had spoken to Elon and he told me he was going to do this. I had left Twitter at the time, for a couple of years, and I had joined this venture capital firm. But Elon said, look, I'm going to do this thing,
Starting point is 02:05:45 can you come help me out? And for me, it is a little bit like getting a do-over, because I could go back, and this is not in an official capacity. I was not an employee, but maybe I can go back and, you know, I didn't have the power to do some things then. I don't have $45 billion to spend on a social media company.
Starting point is 02:06:05 By the way, that was a problem I had. But now I have a chance to put some things right. So do you remember when Elon bought Twitter, the let that sink in day? And he brought the sink in. So I showed up that same day, I kind of snuck in making sure the TV crews didn't see me. And that was such an experience,
Starting point is 02:06:25 just watching Elon work and trying to sort of clean up the company, finding all the ways that, you know, sort of these progressive levers had been hidden. And for me, there was a chance to kind of put right some of those things. And I would say that moment changed how free speech on the internet works. Right? Because all of a sudden, you had a major platform, Twitter, now X, which was trying to live by free speech, which brought back the president, right?
Starting point is 02:06:56 You know, which brought back a lot of people who had been taken down for just asking simple questions like, hey, what is the origin of the COVID virus? Okay? Are masks effective? Right. Or how can you say that we can't sit together in a restaurant but a protest is okay, right? Like, how does that work? But even just to ask a question would just get your account banned. And here's a platform that was bringing it back. And look, people have a lot of complicated views on Elon or the company, but I do think that moment was huge in changing how
Starting point is 02:07:33 free speech on the internet works. And I think even today it is not appreciated how important that moment was. I mean, I think it revolutionized everything immediately. And I, I don't know. Are you on Twitter? Were you active then? You must have been. Yes. I probably wasn't very active.
Starting point is 02:07:52 That's probably the, I don't know. That's maybe the platform I used to be least active on. I'm a lot more active on it now. But I really pull away from it, I go on there mostly to see how our content's doing. Because I do think that there is a lot of toxicity. I know there's a lot of toxicity that comes out of all these platforms. I try not to fall into that, but, I mean, I think it was, well, I don't think. I know it was instrumental in what Elon did because, like, once again, from an outsider looking in, what I saw was he would have capitalized on the entire market and taken the entire market share by providing an actual free speech platform where you don't have to worry about asking questions.
Starting point is 02:08:44 You don't have to worry about calling out corruption and you're not going to be censored. And I think that all these other companies, Meta, Google, you know, maybe Amazon, I'm not sure. But, you know, I think that they saw that happening and they thought, oh, shit, if we don't, if we don't loosen the reins up a little bit here, we're going out of fucking business. And I think they would have gone out of business. And with some of them, I'm actually surprised that they didn't, just because of the repercussions of what they did to half the country. Oh, yes. You know, and, but I think that they saw that, that gaping hole in their business when Elon
Starting point is 02:09:25 secured Twitter and turned it into X and everybody else had to follow suit. Am I, what do you think? Am I wrong on that? I think you're very right. I think there's another dynamic, which is courage, which is he just took a lot of the arrows. And so when he did that, some of these companies, they just want to stay out of trouble. They would love it if there was nothing political
Starting point is 02:09:52 ever said on their platform. They really don't want to be in the business of navigating where the COVID virus came from. They just want, look, come here, have a great time, have great content, we want to sell some ads against it, let's go home, okay? But that's not the choice. Obviously, when you're running a major platform,
Starting point is 02:10:10 It's almost like running a country. So they got sucked into it. And I think, you know, there's an amazing book you should read from the 50s. It's a bit hard to get on Amazon. It's called Private Truths, Public Lies. Private Truths, Public Lies. Sounds interesting.
Starting point is 02:10:29 Yes, and this idea, and I think it's out of print or something, but it's a very simple idea. It's a very simple idea that sometimes in society, everybody starts to pretend to believe in a lie just because it is the convenient, expected thing to do. During COVID, right? You would somehow have to wear a mask into a restaurant and be asked to do so,
Starting point is 02:10:56 but you could take off the mask once you sat at the table. You're like, okay, hold on a second. How do these two things, like, logically, this can't totally make sense. There's something wrong here, right? But you can't really speak out against it. So if you did, there were consequences. Okay, so what this book says, the second part of this book, and it's a great book, please, I'm oversimplifying,
Starting point is 02:11:16 is what it takes is it takes one person or an entity to point out the emperor has no clothes. All it takes is one, right? But when you have that one person do so, usually at, like, sufficient, kind of, like, risk to themselves, everyone else starts to fall in line. Makes it okay. Makes it okay.
Starting point is 02:11:37 You had two guests on your show who've done this, right? Like, you know, Brian Armstrong with his mission-oriented company statement. Like, did you talk about that when you had him? We did. Right, right. So that's such a great story. So Brian, please go watch the episode because he talks about it. But he basically gets bullied into having to put out a statement, right, which he doesn't want to.
Starting point is 02:12:01 And he doesn't like the fact that he's being bullied. And he says, listen, we are not a political, social company. We are in the business of making, you know, crypto available safely, securely to everybody. That's the job. That's the job we are in, right? So he puts out this blog post, which was hugely controversial because he was accused of being racist.
Starting point is 02:12:25 He was on the front of the New York Times. And he says, like, okay, we are a mission-oriented company, which sounds crazy that that was controversial. But he says, if you work here, you are signing up as a part of this mission, which is to make crypto available safely, securely at scale. If you don't like this mission, don't work here. No harm, no foul. Go find a job somewhere else, right?
Starting point is 02:12:50 If you care about something more than this, that's also fine. We just ask that you do not bring that into the workplace. Because in the workplace, we care about this mission. It's a big mission. We have competition. The stakes are high. The rewards are high. It's going to take a lot of energy.
Starting point is 02:13:07 out of you, right? And as long as you focus on this, we don't care about anything else. But that was so controversial. Because at the time, you know, I was watching inside these tech companies the rise of DEI, right? Like, I remember, you know, my wife at the time had a little bit of a stint at Meta, and you would have these employees who just totally hijacked meetings. And they would ask people to issue apologies. They would say, we have to comment on literally everything happening in the world. And a lot of Silicon Valley executives were just scared. Maybe they should not have been scared. Scared of their workforce? Yes, they were scared of their own employees. They were absolutely scared of their own employees, right? Like, they were terrified
Starting point is 02:13:58 of, man, I don't want to seem, you know, sexist, racist, whatever it is. Like, I don't want to seem that, right? And they would get attacked all the time. I spoke to some really famous people who are like, I don't want to hire this exec, but I'm being forced to. Because if I don't, I'm going to be accused of discrimination in some shape or form. And then it's going to piss a lot of other people off.
Starting point is 02:14:28 They don't want to piss some of my investors off or the media off. I don't know what to do. And there were so many people. And I think, look, if you're harsh, we can say they didn't have enough courage. But also, look, they had a lot of employees. They were trying to make sure, like, the business doesn't get into trouble. They had customers. And they're like, we don't want to be in the political game, but I'm scared. And this was a thing all over Silicon Valley. It was almost, I would say, a rising infection, which had spread by 2020.
Starting point is 02:14:56 A virus. Yes. Like, I remember, kind of jumping back a bit, the original moment when I felt the rise of DEI in Silicon Valley. I think this was in 2013, 2014. And there's a company called GitHub. They were a popular developer company. They built, like, source code products for developers. And at the time, they actually had a funny replica of the Oval Office. That's kind of a thing. And they had a carpet. Like, instead of, you know, the presidential, like, seal, you know, with the bald eagle, there was a carpet with their logo and it said, in meritocracy, we trust. Okay. And, you know, you would think that's safe, meritocracy, you know, if you work hard, you know, you're the best at
Starting point is 02:15:42 what you do, you win, that was hugely controversial at the time. And I remember a lot of us going, that's weird, right? Like, you know, can you imagine, like, an NFL draft combine where if you say, hey, you run a 4-4, you know, you're pretty good at maybe being a wide receiver, and they're like, well, you can't talk about that, that's just, like, racist. But that became a thing. And then I saw, over the years, quotas. You know, me, a lot of other people, we would get into hiring situations where you'd be told, hey, if your entire team, you know, is male, white, pick your thing, you're being racist, right, or sexist. And you are like, well, look, I want to hire amazing people, right? I don't care where they are, you know, I don't care what
Starting point is 02:16:34 they look like. I just want to hire the best people for the job, but there was a lot of pressure, I know, on Silicon Valley execs to be like, okay, my board has to look a certain way, you know, my exec team has to look a certain way. Otherwise, you know, the New York Times is going to come after you. And you might lose your job. And some people did lose their jobs, by the way. So this was a bad space to be. And I think a lot of people outside Silicon Valley don't recognize how bad it got. Like, I was at a venture capital firm. And, you know, we used to get calls from all these founders, like, I'm just so terrified of my own employees.
Starting point is 02:17:09 I don't want to deal with this. Like, I'm just building, I just want to build a company. I want to build a product, serve my customers, help my employees, you know, hopefully we all make some money, right? Like, I am not interested in weighing in on the latest social issue, but they were forced. And so the reason I bring this up, so when Brian Armstrong did that, right, like, he was taking a huge risk. And I'm not sure he kind of took enough credit for it when he did your podcast.
Starting point is 02:17:32 But he was taking a huge risk, right? He could have been fired. They were a public company. He could have been ousted. You know, you had all these ESG firms. You know, you had all these firms. They've kind of stopped doing it now in some ways, thanks to who won the recent election. But, you know, it was an idea that, hey, unless you kind of check off all these boxes on the
Starting point is 02:17:56 environment, on diversity, we won't invest in you. And if we are on your board or we're in your cap table, we are going to put pressure on you to do these things. Wow. And so he took a lot of risk and I think it was not easy for him. But when he did that, so, you know, one, I have a lot of respect for him, because to do that, he could have just played it safe. He could be like, yeah, I'm just going to say the things which everybody wants me to say and I'll be the toast of the town and I'll be happy. But he took on the crowd, right? He took on all these folks who were trying to bully him. So number one, he deserves some respect
Starting point is 02:18:32 for that. Second, when he did that, it sort of opened the Overton window. People were like, oh, wait, I can do that. Like, I can now start, like, saying, oh, wait, maybe we just need the best hire. I don't want to just fill a quota. And Alex Wang, who you had, you know, he did this thing where he said, instead of D-E-I, you know, I want M-E-I, where it's meritocracy, excellence, and I forget what the I stands for. But it's the idea that, you know, we focus on things you kind of want in a workforce, right? You want people who work hard, you know, you want the best of the best, right?
Starting point is 02:19:08 Like, you want, and you don't care what they look like or where they're from or what they do in their private lives. So Alex took some heat for that, Alex Wang, when he did that, but that set the tone. So I have a lot of respect for some of these people. I work with some of them. We spoke about, you know, folks like Balaji Srinivasan, who I think are similar, who took these arrows when they didn't need to.
Starting point is 02:19:29 who took these arrows when they didn't need to. and then open the over 10 window for a lot of people to follow them. I mean, I think Silicon Valley listened, well, maybe they didn't listen, but, I mean, it seems like, you know, talking to, you know, talked to a lot of innovators in the tech space recently this year, and a lot of these guys moved down to El Segundo, it seemed like, for a different culture. And so now we see, you know, and I'm not terribly familiar with it. I've never been to Silicon Valley or El Segundo,
Starting point is 02:19:59 but it seems like El Segundo is rising up is like Silicon Valley, too. Yes. I think yes, for a certain class of companies. El Segundo, that part of that whole community, they have a lot of ties to sort of the defense establishment to making hardware. And so which is why you're seeing
Starting point is 02:20:25 a lot of these American dynamism companies come out there. They can pull talent from elsewhere. And I think the other places which have started doing really well: Austin, Texas has started doing really, really well. I always have a soft spot for, like, Raleigh, North Carolina. I think they have a great ecosystem there, which is awesome. Sometimes you have the ecosystem outside Silicon Valley, which is great because you get these people who don't have a huge ego. They want to work hard and they're not going to switch companies every year and go join somebody else. But I would say Silicon Valley is still really important,
Starting point is 02:21:05 just because if you think about a lot of these AI companies, for example, a lot of them are in Silicon Valley. And so one of the things, just a few plugs, one of the things we've done in this administration to combat a lot of these things, about wokeness, about politics in platforms, is we released this executive order. President Trump signed this executive order three weeks ago, which is called, like, stopping woke AI.
Starting point is 02:21:33 And I played a big role in this along with David Sacks. And the whole idea behind the executive order is that, you know, it's not that we want you to pick a left versus right ideology. We just want no ideology in AI, right? Like, we don't want employees to put their thumb on the scale and sort of embed their beliefs into such a crucial piece of technology.
Starting point is 02:22:01 And this kind of sometimes accidentally happened. Like, for example, last year there was this sort of infamous incident where if you asked a leading large language model, show me a photo of George Washington. It showed you a black George Washington. And it was sort of an accident. And it's kind of, you know, and I think it's not like super deliberate, but there are other cases which are way more insidious.
Starting point is 02:22:22 And I think, given that and given the fact a lot of these AI companies are in Silicon Valley, which often just has a very left-leaning ideology, I think it's super important to figure out, okay, how do we make sure we don't have a repeat of what I was talking about, what I saw at Twitter? How do we make sure of that? And by the way, with AI, it's going to be so much more important than social platforms. It's going to be so much more important.
Starting point is 02:22:46 It's going to be so much harder to find things. So what the executive order says, and a lot of the things we have done say, is, number one, no ideology. We want no thumbs on the scale. Number two, we want transparency, right? Like, we want to know, you know, if you give a certain answer, even, for example, a political comment, that's fine, just tell us where you got it from. Tell us what your sources are, right? Like, you know, let the audience, let the viewer, let the person interacting with you, let them make up their own minds. Like, don't hide it, right? Because I think if you go back to my Twitter story, that story about this
Starting point is 02:23:22 Hollywood movie, if somebody could tell how the algorithm was working, right, they'd be like, oh, wait, something's weird. Let me go investigate. So I think sunlight is the best disinfectant, right? I am a transparency, like, maximalist when it comes to technology. So with this woke AI EO, with other things we have done, we are like, all right, these things are so important for our economy, for the world. And we want to make sure there is, one, no politics.
Starting point is 02:23:50 And also, we know what's happening behind the scenes. How does an ideology get inserted into an AI platform? Is that from the engineers? I mean, it's just processing so much data. I mean, I'm probably way off. But the way I understand it, the way I took it from Alex Wang, is that, you know, you have these enormous data centers, and that's what the AI model pulls from
Starting point is 02:24:20 and processes and, you know, gathers the data, processes it and presents it to you. And so, I mean, I would imagine, you know, I don't know what all goes into that, but I would imagine it would be fairly easy for a rogue engineer to insert his own ideology into that and it goes unnoticed by the rest of the company. Is that how that happens?
Starting point is 02:24:41 There are many, many ways how ideology can be inserted. Like, the word ideology is so broad and vague. So I'm an engineer. Let's make it very specific, okay? Let me give you actually maybe a non-AI story and then we come to AI. So when I was at Twitter, right, for a while there was this phenomenon where left, Democratic congressmen, or sometimes other Democratic political figures, would get ranked higher in the algorithm. They would show more.
Starting point is 02:25:10 Like in the YouTube algorithm, you know, the algorithm bumps you up, just like on Twitter, higher than figures on the right. And the reason that it happened was when you train the Twitter algorithm, you give it examples of good and you give it examples of bad and you tell the algorithm, hey, go find more things like this. And the people who were sometimes giving these lists, and I'll just give them the benefit of the doubt, right? I don't think they were trying to put the thumb on the scale, but they had certain beliefs.
Starting point is 02:25:40 They stuck in the news organizations they followed. They stuck in the political figures they liked. They maybe didn't know or didn't approve of people on the other side. So the algorithm kind of learned good to mean a certain class of views, a certain class of publication. Okay? And so those guys start getting more traffic, you know, more attention. And so it kind of spiraled.
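A minimal Python sketch of what that kind of labeling step can do, assuming a toy pipeline where curated examples define "good" and "bad" sources; the sources, boost values, and structure are invented for illustration and are not the real Twitter system.

curated_good = [  # examples a well-meaning employee hands to the trainer
    {"source": "outlet_a", "text": "..."},
    {"source": "outlet_a", "text": "..."},
    {"source": "outlet_b", "text": "..."},
]
curated_bad = [
    {"source": "outlet_c", "text": "..."},
]

# "Training" here is just remembering which sources showed up as good or bad,
# so the curator's own reading habits become the definition of quality.
good_sources = {ex["source"] for ex in curated_good}
bad_sources = {ex["source"] for ex in curated_bad}

def ranking_boost(tweet):
    if tweet["source"] in good_sources:
        return 1.5   # quietly amplified
    if tweet["source"] in bad_sources:
        return 0.5   # quietly suppressed
    return 1.0       # sources the curator never read get no help either way

print(ranking_boost({"source": "outlet_c", "text": "a perfectly factual post"}))  # 0.5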
Starting point is 02:26:04 So I bring up that story because very easily, and sometimes even without any mal-intent, sometimes there is, like, mal-intent, you know, but with or without mal-intent, you can slant the system. Okay? Now, with AI, right, so very simply, you know, just to kind of level-set a bit, right? The way a modern AI model works is there are kind of like two steps. One is a process called training. And the way I would think about it is you take all of human knowledge, all of the internet, imagine this sort of massive, like, kind of witch's cauldron, you know. But imagine
Starting point is 02:26:45 the cauldron spans, like, multiple football fields of data centers. You dump it all in, like every book ever written, every movie ever made, all of Wikipedia, all of Google, all of YouTube, you know, every bit of human knowledge ever gathered. You stick it in there. And then you have this algorithm, which I won't really get into, try and make sense of it. So for example, it says the word Sean, what should the word Sean be followed by? Okay. Let's say, I say, okay, the word Sean, followed by Ryan. Then you're like, okay, Ryan, and then what comes after that? Well, the Sean Ryan Show. Operator, military, intelligence, CIA. Makes sense, okay? Let's say after Sean, I say Michaels. Okay? Then you get titles, wrestling, heel, face, Stone Cold. So it is kind of training and trying to make sense of all of this knowledge, okay? So that's called training.
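To ground the "what follows the word Sean" intuition, here is a toy Python sketch that shrinks training and inference down to counting which word follows which; real models learn billions of weights, but the next-word idea is the same, and the tiny corpus here is invented just for the example.

from collections import defaultdict, Counter

corpus = "sean ryan show sean ryan podcast sean michaels wrestling".split()

# "Training": count which word tends to follow which.
following = defaultdict(Counter)
for current_word, next_word in zip(corpus, corpus[1:]):
    following[current_word][next_word] += 1

# "Inference": given a prompt word, keep emitting the most likely next word.
def generate(word, steps=3):
    out = [word]
    for _ in range(steps):
        options = following.get(out[-1])
        if not options:
            break
        out.append(options.most_common(1)[0][0])
    return " ".join(out)

print(generate("sean"))  # "sean ryan show sean"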
Starting point is 02:27:46 Then there is a separate step where you, you know, you kind of take this large blob, and there is a step called post-training where you are basically trying to make the model better in specific ways. For example, you're trying to make sure it gets better at, say, coding, or you're trying to make sure
Starting point is 02:28:09 it gets better at science, and you're trying to make sure it gets better in these kind of very targeted ways, right? Like, you might have heard of this phrase, fine-tuning. You're trying to make it better in these very targeted ways. And so, at the end of this, you kind of get this fully cooked model. And then you have inference, which is you go to ChatGPT, or you go to Grok, or you go to Gemini on Google, or you go to Claude. You type in a question, right?
Starting point is 02:28:36 And then it starts giving you letter by letter, word by word. Like, that is inference. Like, you're basically getting back answers. So the point, I want to kind of explain this whole thing, is just to kind of ground a lot of the other conversations we're going to have. But also, every step of this, right, like, could be infected by ideology, right? Number one, let's go to, you know, the step of this big cauldron. What if you only stuck, you know, left-leaning content in there? There's this data, gosh, I wish I could remember it off the top of my head, that a lot of the written Internet is left-leaning in nature.
Starting point is 02:29:12 And so if you just dump the internet in there, you could just get a leftward bias, just right there. So just by sort of the sources of data, your model could wind up learning, you know, just a certain ideology. Second is when you fine-tune these things, right? Like, for example, DeepSeek, which we'll probably wind up talking about, this Chinese model, right? There's a lot of evidence that it was fine-tuned to add Chinese ideology. Say nothing happened in Tiananmen Square, or what China thinks of Taiwan. Does that happen in that part of the process? Another part of the process where you could add things is when the model is trying to react
Starting point is 02:30:02 and give you a token and it is thinking. A lot of these models have instructions on how they should think, rules they should follow, right? You could set up a rule which says, pick the answer which optimizes, if you're me, I would say optimize for truth, or you could say pick the answer which optimizes for equity, right, as in DEI, right? And it's going to slightly slant the model some way, right? Or pick the answer which is least offensive, rather than pick the answer which is intellectually honest.
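Here is an oversimplified Python sketch of that point: the same candidate answers, scored under different house rules. The rules, fields, and scores are invented purely for illustration.

candidates = [
    {"text": "blunt but well-sourced answer", "honesty": 0.9, "inoffensiveness": 0.4},
    {"text": "vague answer nobody could object to", "honesty": 0.5, "inoffensiveness": 0.9},
]

def pick(rule):
    # Whatever the rule optimizes for is what the user ends up seeing.
    return max(candidates, key=lambda c: c[rule])["text"]

print(pick("honesty"))          # optimizing for truth returns the blunt answer
print(pick("inoffensiveness"))  # optimizing for "least offensive" returns the vague one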
Starting point is 02:30:35 These are incredibly oversimplified examples, but you can see that, right? And so any one of these steps could have either deliberate or accidental ideology inserted. Now, in the Google case, and I could be wrong, you know, because I only sort of read some press reports later. I don't know what actually happened. I think what had happened is somebody, by accident, had added this idea that whenever the model is trying to generate photos of human beings, to try and generate people of every race. But then you're like, wait a minute. If you get a photo of a Nazi, right, like, they're probably not, like, Asian.
Starting point is 02:31:16 Like, that's not what, like, the average Nazi in Germany looked like in the 1940s, right? Or if you have a photo of a Viking warrior, right? Like, you know, they probably didn't look like me. Like, that's not what Vikings looked like, you know, when they shipped out from Norway. But this little thing was like, oh, whenever we're asked for a human being, I'm just going to spread the human race out there. So what happens? You get the black Pope, the black George Washington, right?
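A hypothetical Python sketch of that kind of rule. This is not Google's actual system; it just shows how one small, unconditional prompt rewrite can produce the results he describes.

DIVERSITY_RULE = ", showing people of many different ethnicities"

def rewrite_image_prompt(prompt):
    # The rule fires on every request for a human being, with no awareness
    # of historical or factual context.
    return prompt + DIVERSITY_RULE

print(rewrite_image_prompt("a portrait of George Washington"))
print(rewrite_image_prompt("a Viking warrior shipping out from Norway"))
# Both prompts now instruct the image model to diversify figures that,
# historically, were not diverse, which is how you get the results above.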
Starting point is 02:31:48 So, and I hope I'm not making this too complicated, but this idea is that these things can be very subtle. They can be hard to find. They can be deliberate or inadvertent. And for me, you know, in this role, I think, look, one of the things we really care about is making sure that it is searching for truth and that there's no ideology of any kind. And if you look at the executive order, it doesn't say you need to optimize for the right. It says you need to be truth-seeking, optimize for the truth. And if you don't know the truth, express skepticism and tell us what you're reading. And my hope is that, regardless of whether you and I agree on our political beliefs, you will probably agree that truth-seeking is a good thing.
Starting point is 02:32:30 So that's the hope. Mm-hmm. Mm-hmm. Mm-hmm. Wow. Yeah. It's a great explanation. Best I've heard, thank you. Thank you. Thank you. Here's the thing. The people who listen to this show are some of the hardest working in America.
Starting point is 02:32:47 So you should never have to choose between comfort and durability when it comes to your work boots. That's why I'm so glad I found Brunt. Brunt sells tough boots that feel great from day one and are seriously comfortable right out of the box. No breaking them in, no sore feet, and they're built to perform. And Brunt believes in their products. You can wear them to work, and if they're not right for you, send them back. I love that they stand behind everything they make. Brunt was tired of the workwear brands out there cutting corners. You work too hard to be stuck in uncomfortable boots that don't hold up. So Brunt built something better,
Starting point is 02:33:28 boots that are insanely comfortable and built for any job site. For a limited time, our listeners get $10 off at Brunt when you use code SRS at checkout. Just head to bruntworkwear.com, use the code SRS, and you're good to go. And after you order, they'll ask where you heard about Brunt. Do us a favor and tell them it was from this show. My days don't slow down. Between work, the gym, and time with the kids, I need eyewear that can keep up with everything I've got going on, and that's why I trust Roka. I've tried plenty of shades before, but these stand out. They're built for performance without sacrificing style. I've put
Starting point is 02:34:16 them through it all, on the range, out on the water, and off-road. They don't quit. They're lightweight, stay locked in place, and are tough enough to handle whatever I throw at them. And the best part, they don't just perform, they look incredible. Sleek, modern, and designed for people who expect more from their eyewear. No fluff, no gimmicks, just premium frames that deliver every single time. And that's why Roka is what I grab when I'm heading out the door. Born in Austin, Texas, they're American designed with zero shortcuts. Razor sharp optics, no glare, and all-day comfort that doesn't quit. And if you need prescription lenses, they've got you covered with both sunglasses and eyeglasses. One brand, all your bases. Roka isn't just eyewear, it's
Starting point is 02:35:10 confidence you can wear every day. They're the real deal. To upgrade your eyewear, check them out for yourself at ROKA.com and use code SRS for 20% off sitewide at checkout. That's ROKA.com. All right, so you remember, we're back from the break. We were talking about Twitter, how you got into there when Elon took over and basically eliminated the ideology of wokeness at Twitter. But, you know, something that we didn't talk about yet is A16Z, the Andreessen Horowitz VC firm. And so I'm curious, you know, we've had some offline discussions about investments and things like that. But I'm curious, how did you get picked up for that? How did you find
Starting point is 02:36:00 your way into there? So, so A16 Z stands for Andreessen Horowitz. So A, followed by 16 letters and Z and A16 Z. It's kind of a tech thing. And so they're one of the leading venture capital firms in the world. I would say the leading venture capital firm. I think they're the largest in terms of money they have. And the founders are, you know, these two iconic figures in Silicon Valley, Mark Andreessen, who we spoke about, who invented the web browser, and Ben Horowitz, who was with him at Netscape and is a best-selling author
Starting point is 02:36:34 among many, many other amazing things. And in a lot of ways, they revolutionized, in my view, the way venture capital worked in Silicon Valley. This is a little bit of history. So when people sometimes think venture capital, they think shark tank. You come in, you kind of, you know, you pitch an idea, some like rich person in a suit
Starting point is 02:37:00 gives you some money or not. There's some element of that, but it's actually a lot more complicated. The history of venture capital actually goes back to whaling and fishing. So back a couple of hundred years ago, you would get these sort of sailors or whalers, I don't know what, and they'd be like, hey, we're going to go out and we're going to go into these stormy seas and take a lot of risks and be out for a few weeks, maybe catch some, you
Starting point is 02:37:27 know, whales and come back. And it's an incredibly risky investment, right? Like, half the time they don't come back. Let's just say, you know, the hazard rate was pretty high. And so they would go around and they would go to people with wealth and they'd be like, you know, why don't you stake me, you know, on this journey and I will give you a percentage of the whale I carry back, right? Which is, by the way, where the word carry in venture capital comes from. It was literally sort of, you know, the whale and sort of the thing that you carry back,
Starting point is 02:38:06 you would get a percentage of that carry. So even 200 years later, that word has kind of stayed on in sort of the risk capital discourse. And so the history of venture capital is super fascinating. And I think it's an essential part of why American innovation and Silicon Valley have existed. So in the 70s or 80s, what would happen is, you know, before venture capital, if you're an entrepreneur, right, and you have an idea, how do you go get money? You go to a bank, you get a loan. Maybe you have some friends and family, you know, they give you some money, but it was very, very hard to go to a bank and say, I have a really risky enterprise
Starting point is 02:38:49 and technology, which you may not understand, but I need a few million dollars, and by the way, you may never see it again. And you may not have friends and family who can do that. So, I'm going to say in the 70s, there were a set of folks who wound up forming Sequoia Capital and Kleiner Perkins, and they developed the modern venture capital industry. These were folks who made some money from Silicon Valley, sort of the original silicon in Silicon Valley. They had kind of been part of these big chip companies, and that's where the word Silicon Valley comes from.
Starting point is 02:39:24 They made some money, and they basically said, okay, we are going to stake these entrepreneurs, fund them on their journey, not out of sort of the goodness of their heart, because we know that when these companies do really well, we stand to make a huge return on our investment. But the thing, which I think is very different from the rest of the world,
Starting point is 02:39:44 is they were willing to lose their money. They didn't love it, but they're willing to take bets on these risky enterprises. So I think if you look at a lot of these amazing companies in Silicon Valley, Apple, for example, or Google or Netscape, they all had one of these venture capital firms
Starting point is 02:40:02 who were essentially taking bets on a totally unknown person and saying, I'm going to give you some money. A famous example is Google. So Google comes by, I think, in 1998, right? And again, I'm dating myself here, but Google was not the first, the second, the third, it was maybe the fifth or sixth search engine.
Starting point is 02:40:19 You remember using Ask Jeeves, like us? Yeah, yeah. Yeah, right? Or yahoo.com, like, Yahoo is another famous one. And back then, you know, you would use these search engines, and people were like, well, look, search is really not a business, right? We don't know how to make money off it, and it kind of sucked. And then these two guys, Larry,
Starting point is 02:40:36 And then these two guys, Larry, and Sergey Brin basically come out with a new algorithm. They were these Stanford computer science guys. They came with this new algorithm called page rank. By the page rank is super interesting. It basically says that I'm going to think you are important if a lot of other people who are trustworthy also think you're important. And that in one oversimplified sentence is kind of how the origin of Google work.
Starting point is 02:41:01 But, like, they don't have any money. They actually went to a lot of other companies and they said, do you want to buy our algorithm for a million dollars? For a million dollars. So they went to all these existing search engine companies. Imagine, by the way, you could have just bought, like, the original Google algorithm for, like, less than a million bucks. And everyone just said no.
Starting point is 02:41:18 So they don't know what to do. And so they, I think, came to two essential parts of the Silicon Valley ecosystem. First is angel investors, and the second is venture capital. So they went to basically kind of these wealthy Silicon Valley guys. This guy named Andy Bechtolsheim, and they said, listen, we have this idea. We need some money. And Andy, I think, kind of gave them $100,000. They didn't have a bank account.
Starting point is 02:41:41 So they didn't know where to put the money. So they kind of kept it and started building Google. But so think of this construct. Two guys, crazy idea. Nobody thinks it's going to work. But the ecosystem and the culture of the valley was like, okay, if we think that you have a shot at this and you have done some work, there are probably enough people with
Starting point is 02:42:01 capital who are going to take a bet on you and are probably going to assume that 90% of the time this money is not going to come back, but every once in a while you're going to create an iconic company. So Google then, of course, gets funded by another iconic firm called Kleiner Perkins, and then they go IPO, and they obviously kind of become the huge giant they are.
Starting point is 02:42:19 So all of Silicon Valley has the stories of these amazing companies, but these venture capital firms were at the heart of it. So this story is very interesting because Andreessen Horowitz totally revolutionized, in my mind, this ecosystem. Back in the day, venture capital firms were sleepy, you know, they were kind of behind the scenes. They would never say anything in public.
Starting point is 02:42:43 They didn't want any attention or controversy. The idea was you come in, you pitch us, you know, go on your way. The other thing was strategically what would happen is you would go pitch a person at a venture capital firm, a partner. Usually they had like five to eight partners. And that person would be assigned to you. So, and that person, if you needed some help, like maybe like, hey man, my company is failing, like, I need some advice. You had to go to that one person for advice.
Starting point is 02:43:09 If you're like, hey, I need a contact in the Pentagon. Like, do you know somebody at the DoD, that person better have a Rolodex, right? But you're essentially with one person. So anyway, so Mark and Ben, they had their own experiences with bad venture capital firms, and a lot of bad boards, right? They do a lot of bad things to founders. They, you know, throw CEOs out. And they also were like, listen, why is it that, you know, when we go,
Starting point is 02:43:33 we have to rely on this one person and we need all this other help. And they met Mike Ovitz. You know who Mike Ovitz is? I don't. So Mike Ovitz is the founder of Creative Artists Agency, CAA, right? I would say along with WME, the two iconic Hollywood talent agencies, right? He's a guy who's represented, like, every Hollywood, I think he's retired now, but he's represented every Hollywood celebrity, bestselling author, when he was running CAA. And CAA had this amazing strategy to win the market, right? What they did was, like, number one, they said, we're going to be very loud, right? Like, we're going to make sure everybody knows our name, we're going to be brash, we're
Starting point is 02:44:14 going to use these red binders, which are a color that's attention-grabbing, we're going to get people to pay attention to us. The second, and this is very interesting, they said, if you sign up with us, say you are an A-list actor, the reason you should sign up with us is
Starting point is 02:44:45 So if you come in and you have an idea, you know, we don't need to go find the right editor. You don't need to hope that I have a friend, you know, who can, you know, direct your movie. We have entire teams whose whole job is to maintain a roster for you at all times. And they would go and win all these clients because you would go, what is your image of a Hollywood agent? A guy in a sharp suit, right? Like, you know, on a phone, quick talking, maybe he has a roller decks. CA industrialized it. They had some of that for sure.
Starting point is 02:45:13 But they industrialized it because they're like, you come in, we can plug you into this machine, which is going to make great movies for you, make you famous. So Mike Ovitz was Mark Andreessen's mentor and friend. So Mark and Ben heard this and they were like, why don't we do this for venture capital? Because, let us say, a typical entrepreneur, we spoke about, someone in your 20s, 30s, you often don't know what it means to build a business,
Starting point is 02:45:39 or you're good at one thing. Let us say you're good at building technology. You're great at computers like I used to be, right? Or maybe you just know your product area very well. You're great at making ice cream. You're great at building rockets. But are you the best CFO? Are you the best marketer?
Starting point is 02:45:56 Do you have a Rolodex of, you know, government officials if, you know, something goes wrong? You need all these other things which these young CEOs didn't know. Or let us say you run into trouble. You're running out of money, right? How do you restructure a deal? How do you close a big customer? How are you supposed to know these things, right?
Starting point is 02:46:13 So what Andreessen Horowitz said, Mark was like, we are going to copy and replicate the CAA model, but for technology venture capital. Okay, so we are going to have a roster. We're going to have a roster of every single amazing CFO in Silicon Valley. Every single amazing marketer. We're going to maintain a roster of potential board members. So when you come in, you know, a 24-year-old with this idea, who has something working, we are going to plug you into the system. You want a CFO? We have everybody on speed dial. We have done them favors. So they will pick up our call. We can get you in. By the way,
Starting point is 02:46:49 they can also help you hire a CFO. Often, young founders don't know how to hire executive talent. How do you hire a 50-year-old head of sales if you've never met a great sales leader before? How do you even know how to do that? We have a team who've seen every great sales leader to help you do that. So that was a different product, right, from classic venture capital. So number one, instead of one person's Rolodex, we're going to give you the whole sort of system. The second thing, they were loud, they were brash, right?
Starting point is 02:47:17 They had press articles, you know, Mark Andreessen wrote this famous blog post called Software is Eating the World, which basically said that, you know, every business, you know, on planet Earth is going to have software underpinning it. And I think in some ways, I think he's been proven right. So there's the backstory. And this is kind of some of the tactics A16Z used to, I would say, become one of the most famous, powerful firms in Silicon Valley. So how do I come in there? Right.
Starting point is 02:47:45 So Mark has this great strategy, which he calls harpooning. In the venture capital tech business, right, like if you sleep, if you are not paying attention to what the next generation is building, time will pass you by. You have to stay current. You have to stay always on the edge of what people are doing. And Mark, and there are many others who are very good at this, like Peter Thiel, Lonsdale, who I think has been here, amazing people there.
Starting point is 02:48:15 Mark was extremely good at it. He would have this tactic he called harpooning. And what he would do is if he saw anybody online who had written or done something interesting, he would send them an email, right? And getting an email from Mark Andreessen is like, he's a very notable technology figure. So you're just like,
Starting point is 02:48:35 well, what is that? And the reason you do that is like, I want to get to know this person before they become famous, before they build the next big thing. It's a little bit like, you know, you see, I don't know, a 15-year-old who has amazing skills and, you know,
Starting point is 02:48:49 but I don't know what the legality of this as a coach is. You're like, I'm going to make sure I'm, like, building a relationship with this person because someday if they want to pick a school or a team, I have a relationship with this person. So, Mark was very, very good. I'm pretty sure even to this day he sends out these emails out of the blue. And he harpooned me.
Starting point is 02:49:07 I had written a blog post in, I would say, 2012, 2013, and I get this cold email one day from Mark Andreessen saying, hey, I like this blog post. And I was like, whoa, you know. And this was a very different time in my life, so it was quite shocking. Another blog post. Huh? A blog post. Yes.
Starting point is 02:49:25 The blog post at Microsoft? No, I had left Microsoft, and this was about... That's what they found you with. Yes. A blog post. Yes, there's a whole pattern here, by the way. If I can take a slight tangent, which is, I think, one of the superpowers I think people can have
Starting point is 02:49:40 with very little effort, is putting content out online, doing what you do. I used to write, I used to do video. Because when you put out content online, and if you're passionate and, you know, hopefully you know something about it that somebody learns from, some of the best people in the world are paying attention, right? And I've noticed every sort of world leader in technology is always scouring for new ideas,
Starting point is 02:50:06 new people. And so there's been a repeating pattern in my career where I've written something and somebody at the right moment saw it. And they were like, hey, this guy is doing something which I'm interested in. Let's reach out and make something happen. So one of the things I always tell people to do, especially young people, is write things, put things online. These days I would say get a YouTube channel. Talk about what you're passionate about and somebody will find you. I know Elon, for example, has found amazing hires
Starting point is 02:50:33 because he went on a YouTube rabbit hole and was like, let's get this guy, he looks really smart. You and I talk about how you follow somebody on Instagram and next thing you're like, hey, come on my show. And so I think writing about what you do is such a great differentiator. Anyway, so I wrote something and Mark liked it. He sent me an email. We met up, we built a relationship for many, many years, that's one. The second thing was I started putting my own money, a little bit of my own money, into various companies. I put some into SpaceX,
Starting point is 02:51:02 put some into Alex Wang's companies, Scale AI, into a bunch of other companies which started doing really well. And Silicon Valley is an ecosystem run on reputation. If you are an investor who puts money into a founder and you are a jerk, you never show up, you never take the next phone call, you're not going to do really well because your reputation will spread.
Starting point is 02:51:26 That founder is going to tell his or her roommate. They're going to tell the next person and you will not fare very well. On the other hand, you know, if you take the phone call, if you wind up helping that person when they need you, if you're just not a jerk, right? And you respond to every single thing timely, you know, karma starts accumulating in your favor. And I just built up this portfolio of a bunch of these investments which started to do pretty well.
Starting point is 02:51:54 So I'd made a little bit of a name for myself, is what I would say. And I made a little bit of money. COVID happened. I was sitting at home, you know, collecting pasta and toilet paper like everyone else was. Do you remember the whole era?
Starting point is 02:52:09 Oh, yeah, I still have stocks. Oh, my God. What was your sort of "I can't believe we lived through that" memory of COVID? What's that? What was your "I can't believe we lived through that" memory of COVID? What do you mean, "I can't believe we did that"? I can't believe we did that. I can't believe we as a human
Starting point is 02:52:30 race, or you here, did that. Like for me, for example, the fact that we spent months stuck indoors, not seeing other human beings, just bizarre. I can't believe we did that. Right. And what was that for you? A bunch of stuff. I remember, I fell for it. I completely fell for it for about a month. And then I was like, this doesn't seem right. But I remember spraying packages
Starting point is 02:53:00 at my front door with Lysol. I remember, so... The multiple hand washings, did you do that, wash your hands? My hands were all cracking. Here's a funny story for you. So we had moved to Tennessee and my wife wanted to start a farm.
Starting point is 02:53:19 So we bought, we went and got like six alpacas, a bunch of goats, a bunch of chickens, and some ducks. And did you have any experience in farming before? No, no. Okay. No. We had just started this right before it happened. And so, do you know what an alpaca is?
Starting point is 02:53:37 Of course. My mentor in Seattle had an alpaca farm. It's like a llama, right? Very docile animals. Anyways, it gets hot here, you know, could get to, you know, a hundred degrees here, and so every spring you're supposed to shave the alpacas. I mean, that's what people raise them for anyways, the fur, right? So we call this guy, or my wife finds this alpaca wrangler woman, and she comes down and she shaves the alpacas. I remember at the beginning
Starting point is 02:54:13 it was all these people in Italy were supposedly dying. Yes, the February, March of that year. At that time frame. And they come down and I'm like, hey, make sure you wear a mask. We don't know where these people have been. I go down there after this is all going on and there's like four people on this alpaca. My wife doesn't have a mask on. They're shaving this damn thing.
Starting point is 02:54:38 And these two people that came with the, whatever you want to call it, the groomer, whatever. The alpaca is getting fancy. These are like free-spirited people, they, you know what I mean, they just travel wherever and do this. And two of them were like, yeah, we just got back from this big trip to Italy. And I grab my wife and I'm like, what are you doing, you're gonna fucking kill us, these people just came from Italy, and da da da da da. And she's like, holy, she got all upset about it. And anyways, about a couple days later, you know, because
Starting point is 02:55:14 I didn't, I don't even, I don't watch the news. I got tired of the news long before people got tired of the news, and I was just like, we're just being fed the same bullshit over and over. Well, then COVID turns up. I don't have cable TV at the house. I don't really watch anything, and I just don't want to be fed that shit, you know what I mean? Because I figured out, you know, I mean, I think everybody's figured it out, that they're telling you how to think, what to think, they're injecting thoughts into your head, and it can manipulate the way you think. So we got rid of cable a long time ago. COVID pops up and I was like, hey, let's just see what's on air TV.
Starting point is 02:55:54 Only station we got was ABC, you know, and so I was like, well, let's just see what's going on in the world. So we were fed, you know, that garbage for a while. And then I started talking to some friends that still have news. And that's when I figured it out. I was like, all right, this isn't about a virus. This is about something much bigger. And so, but that was kind of, those were my moments. Oh man, yeah.
Starting point is 02:56:25 Lasted about a month. Isn't it crazy that we all lived through that? We just had our first. I gotta be honest, it's fucking embarrassing. I mean, we didn't know better. We were all told this and nobody had gone through this before and there was so much fear and you're like, is it spreading in the air, not spreading in the air,
Starting point is 02:56:45 like the number of feet between everyone. And we just had our first child, like, a little bit before that. And we spent so many months just not seeing any human beings. Yeah. It was so bad in so many ways. And so bad for sort of elderly people I know who were just stuck. And so it's both funny, but also I can't believe we all did that. But anyway, so I was sitting at home, you know, doing this, not seeing other human beings.
Starting point is 02:57:13 And everyone was on Zoom. If you remember, this was the era where everybody was doing video Zoom meetings. That was the thing. And Mark Andreessen, he reaches out and he said, what are you up to? And I'd left Twitter because I'd gotten tired and I just wanted to do something else. And I was like, well, I'm sitting at home,
Starting point is 02:57:30 waiting for this pandemic to be over. I'm sure it'll be over in a few weeks, little did I know. I'm amazing at predictions, Sean. And he said, well, just come help us out. And what I did not know was, you know, they'd been talking about it for a while. And they had somebody else who I think was going to step aside. And so I became a part of the team. I became one of the general partners, you know,
Starting point is 02:57:55 along with Katherine Boyle. She joined after me, but there are about maybe 20 general partners at the firm. I became one of them and started, became a VC, started investing while also doing my podcast. But that's how my Andreessen Horowitz journey started. Wow. I learned a lot, by the way, at Andreessen Horowitz about investing. There with Mark, I think I learned a lot about how to be a good investor, which in a lot of ways, I think, can carry over into other things in life. What is it that you see in a startup company that makes you want to invest? I mean, what are some of the points that you look for? Good question.
Starting point is 02:58:38 At the heart of it, that is the job. You are here to figure out who the winners are and put as much money as you can into them. So the first thing is, you probably do not know what to look for until you have met hundreds of companies and founders. So, for example, right, like, you know, if I were to meet somebody from your world, from your background, right? Without meeting a lot of people, I don't think I would know the difference between an amazing top-tier operator versus somebody who's not, just because I'm just not from that universe. And I suspect that if you came into my world, you may not know the difference between a top-tier engineer
Starting point is 02:59:25 and somebody who is maybe just good and not great. And the only thing that sets that apart is whether you put in the time and the effort to meet everybody. So the first thing, if you want to be an investor, is you've just got to talk to everybody. You've got to know what the great founders look like, what the great engineers look like, what the great builders and
Starting point is 02:59:45 marketers look like, whoever they may be. You need to meet everybody. When you meet people over time, you know, you build a spidey sense. Like, I'm sure when somebody reaches out to you, you know, just because of this podcast, you'd now have a little bit more of a spidey sense in terms of how to judge them. Are they legit, you know, are they full of shit, or somewhere in between, okay? And the first thing is, unless as an investor you've done your homework and met a lot of people, you will not know the difference between the next Google founder and these guys who just, they have nothing.
Starting point is 03:00:19 You should have done the homework. That's number one. Number two, the belief I learned is that technology is a sector where often the winners are outsized. Peter Thiel talks about this. Peter Thiel has this book called Zero to One, where he basically says that if you go to Palo Alto, in the Bay Area, there are probably like 20 Indian restaurants, 20 Italian
Starting point is 03:00:48 restaurants. If you invest in one, there is no way that Italian restaurant is going to become the only Italian restaurant in the United States. Just not possible. At best, maybe you have a chain, you know, you go to a few cities, but that's it. There is a cap on how big those Italian restaurants can be. No harm, no offense to Italian restaurants, but that's just the nature of the business. Technology businesses are different, right?
Starting point is 03:01:10 If you invest in the right company, they may be the only search engine people use. They may be the only social media network people use or pick your company, right? So there is a huge difference in picking the winner in a category versus not picking the winner in a category. For example, in 2005, Google dominated the world of search engines. Who was the number two search engine to Google? I don't know. Exactly. Nobody does.
Starting point is 03:01:38 Doesn't matter, right? Like, because Google just dominated. Same with Facebook, right? And so there are these winner-take-all patterns which often wind up happening in technology. Have you seen the movie Glengarry Glen Ross? No. Oh, okay, this is a classic movie, and there's this classic scene where there's a bunch of sales guys, and they're kind of running low on meeting their quotas, and Alec Baldwin comes in, and he's just sort of this amazingly famous salesperson, and he gives them a pep talk, right? And he basically insults them, insults their manhood.
Starting point is 03:02:15 And he says, you know, the guy who gets, you know, the most sales, the winner, gets this amazing car. You know what second prize is after this fancy car? A set of steak knives. Nothing, right? So often in the technology investing world, you know, there's a great scene, you should check it out on YouTube, it's a very similar dynamic, where if you invest in the winner, right, like you're going to be in Google or, you know, Microsoft
Starting point is 03:02:39 or Apple or pick an amazing company, or you're in a company where you're like, oh, I don't even know who the second guy is, right? So how do you then figure out who the next Google is going to be? Well, number one, you have to really, really, really do your homework. And I think what the firm taught me
Starting point is 03:02:54 is that you can get the category wrong, but you can't get the company, the actual winner, wrong. What does that mean, right? So, for example, VR is a good example. Like, Oculus was a big, it was a reasonable winner in VR, but all the other VR companies didn't really do super well.
Starting point is 03:03:15 They might come back now, but people invested money in VR. And what, you know, some of the partners would say is, that's fine. You took a bet on the entire category. You had the best company in the category, that's good, right? But what is not good is if you're investing in search engines and you did not invest in Google, because that is the difference between, you know,
Starting point is 03:03:35 being part of an iconic company and not being part of anything at all. So we often thought a lot about how do we make sure that you're investing in the winner in a category versus somebody else, and you have to wait sometimes until you know who the winner is, right? You have to be prepared, you have to know, you have to know all the founders.
Starting point is 03:03:54 You need to be, they need to know you. So that was, I think, another big dynamic that they taught me. The final, and I think the most important part. I have a question right quick. You know, when you're talking about trying to find the winner in a category. I mean, would it not be wise to invest in several companies within the same category?
Starting point is 03:04:18 Great question. Let me ask you a simple question. You had Palmer here. Let's say you're also invested in Palmer's competitor. How do you think he's going to feel? That's the caveat to that. Right? And so we had a word for it, called being conflicted.
Starting point is 03:04:36 I think the great entrepreneurs don't want you in bed with the competition. Now, of course, there are a lot of ways people sometimes get around it. You know, some people I know, you know, at other places, they say, you know what? Like, we work with everyone equally. But, you know, one of the prizes is, you know, if you just back the winners, and they often don't want you to work with everyone else. So that is a definite dynamic. But to be honest, there are ways around it, where some people will say, guess what, that's just the nature of me. I just work with everybody.
Starting point is 03:05:10 That's the price you have to pay to work with me. That's all fine. But that was just the culture I grew up in when I was at Andreessen Horowitz, which is you pick a person and that's the only person you work with. And I believe a lot in that, because I do think a lot of founders value loyalty. Like, they want you to work with them because they are in a knife fight every single day. Not metaphorical, well, maybe a little, but they're trying to win deals. They are trying to make sure their company is not, like, crushed or running out of money because of the other person.
Starting point is 03:05:40 They don't want you also helping the other person. They want to know that you are loyal to them. And I really believe in that loyalty. I think that matters a lot. And I also think, you know, if people are not loyal, you know, the great founders don't wind up working with them. So that's a big dynamic. But it's a good question. But that's the answer.
Starting point is 03:05:58 I think the most important part, I would say, is you need to have a spidey sense for what a great entrepreneur looks like, and not look, like literally look, but how they operate, what they do. And I was lucky here because I got to spend time with lots of amazing entrepreneurs, sometimes working with them, like, you know, or sometimes from the outside. And you sometimes see similar patterns across multiple great entrepreneurs. Like, for example, I'll just pick one. Every single great entrepreneur is insanely fast. You know, they are urgent, right? I remember, you know, when I've been on teams which are not great, if you wanted to have the next meeting or a conversation about something,
Starting point is 03:06:41 they'd be like, yeah, let's go meet in a week or two. Maybe I put a PowerPoint deck together. You know, who has been in corporate America probably recognizes this. If you work for a great entrepreneur, they'd be like, well, let's talk in five minutes. Let's find the answer. Let's go, go, go. We don't have time to waste. They work 24-7.
Starting point is 03:06:58 They eat, sleep, and breathe this. There is a real sense of urgency, of mission. And I think that's one of the things that I've learned to pick up on over time. And it's not the only thing. I think it's table stakes. It's necessary but not sufficient, as they would say. But you build a spidey sense over time. Interesting. Interesting. How long were you there? Four and a half years, until this job. What are some of your best investments?
Starting point is 03:07:29 Well, I'll pick one. And because it ties to the theme of what we talked about with social media, I just became convinced that centralized social media platforms are not good for all of us. Because if you have a team which doesn't agree with your politics running them, they can just issue orders from the top on down. And that's how I really got into crypto. I became a fan of this idea that crypto is a way
Starting point is 03:07:52 to decentralize these platforms. And instead of having one central company or algorithm which does everything, you know, you can have people have a say in this, right? So, well, personally, I'd invested in companies like SpaceX with Elon or Scale AI with Alex Wang. From the firm, one of the companies I'm really proud of is a company called Farcaster. And Farcaster is a decentralized social network.
Starting point is 03:08:22 And I did this investment, like, a few years ago, and it is run by this ex-Coinbase product builder named Dan Romero, who's awesome. But the idea was that, imagine, you know, with Twitter, let's say you like Elon, you agree with his policies. Let's say in the future Twitter is run by somebody who doesn't agree with you, right? Or say it's like, I just hate Sean Ryan, I want to see his account disappear. Maybe YouTube does that to you. In Farcaster, they built a cryptographic protocol, sorry, a protocol on, you know, on top of the blockchain, where you own your own
Starting point is 03:09:00 social graph and anybody can build a client on top of Farcaster. So what does that mean? Like, right now, if Twitter decides to ban you, YouTube decides to ban you, you're done, you're toast. In Farcaster, you're like, you know what? You can't ban me. I'm just going to go from using this client over to this other one. I'm going to take my follow graph, I'm going to take my followers, I'm going to take my content, I'm going to go elsewhere, right? By the way, the interesting thing is this is how the internet used to work. Let me ask you something. What was your first email? Don't tell me your first email address.
Starting point is 03:09:32 What provider did you sign up for your first ever email address? Hotmail. Great, okay? And then you probably switched, right? You probably went to Gmail, etc. But when you switched, there are a lot of ways to take your email to other places. You could forward your email to other places. If you wanted to access your email, you could do it on your phone, right?
Starting point is 03:09:52 You could do it using the official hotmail.com website, or later on, you could use it on the iPhone or on your own desktop client. So your signing up with the service was not tied to the actual application you were using with it. And you're not tied to it. You could, sometimes, take your email address elsewhere. And with social networks, you lost that. If you want to use Instagram, if you want to use TikTok, you have to use the official app. And there's a lot of reasons as to why, advertising as a business model being one of them.
Starting point is 03:10:20 But what Farcaster is trying to do, and I think others are trying to do in crypto, is let's bring that original internet way back, where you own your handle. Like, so for example, I want a world where even if Neal Mohan, who runs YouTube, gets really pissed off at Sean Ryan, you can take your audience and your subscribers and just go elsewhere. And they can just follow you elsewhere, right? That's a possibility now. It is a possibility. Now, it's early days. And, you know, you have to figure out how to make it happen, right, technically, economically. So they built this kind of
Starting point is 03:10:55 crypto protocol to make it happen. And they built all these internet clients. It's very early days, right? But I love the idea. Because for me, that is the spirit of the internet I grew up in, like, late at night. Because imagine a world where I was late at night on my computer back in Chennai, like 21 years ago, and I was told, oh, wait, before you can write any bit of code, you need to call up a salesperson at Microsoft. Like, I would never have had anything. And so I think this harkens back to an original ethos and spirit of the internet and, I think, of crypto, which is that you own these things. You have a say, you have a stake. Let me give you another example, right? How many subscribers do you have on YouTube right now? Almost 5 million. Great.
Starting point is 03:11:40 How much money do you think YouTube makes per year? How much money do I think YouTube makes a year? In ad revenue, for Google. Man, I have, I've not even got a number. Let's say it's $5 billion, right? Again, I don't mean to pick on YouTube. I think they're amazing, right? How much of that money do you think you are owed?
Starting point is 03:11:56 Or have you even thought of that, right? And, you know, because you are a stakeholder, you're contributing to this platform. If you use another social media platform, you are contributing to that platform. And I think one of the promises of crypto, right, is that, well, let's give, you know, everybody, you could be somebody with five million subscribers, it could be five subscribers, a stake in the platform. A stake in two ways. One, financially: if, you know, YouTube, Instagram, TikTok make money, you get money. Second is a stake in terms of how you want to experience it. If you want to use YouTube.com, or you want to use another app, you should be able to go for it.
Starting point is 03:12:35 If you want to use a different algorithm on the right side, or if you want to use a TikTok with a different algorithm, you're able to go for it, right? Like, I want a world, for example, where one day you open up Twitter or TikTok and you're like, well, you know what, I want to pick this algorithm. I don't want to pick just the algorithm they give me. Imagine you have like a shopping market of algorithms you could pick from. Crypto makes all of this possible.
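To make the "you own your handle" idea concrete, here is a toy sketch, not the actual Farcaster protocol, of how a user-owned keypair lets any client verify your posts without a central platform's permission. It's written in Python with the cryptography library, and the function names are purely illustrative.

    # Toy illustration of user-owned identity (not the real Farcaster protocol):
    # the user holds a keypair, signs their own posts, and any client on any app
    # can verify them without asking a central platform for permission.
    from cryptography.exceptions import InvalidSignature
    from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

    user_key = Ed25519PrivateKey.generate()   # the user keeps this; it acts as their handle
    public_identity = user_key.public_key()   # anyone can hold this to verify posts

    def publish(text: str) -> dict:
        # Sign the post with the user's own key so it can travel to any client.
        return {"text": text, "signature": user_key.sign(text.encode("utf-8"))}

    def verify(post: dict, identity) -> bool:
        # Any client can check the post really came from this user.
        try:
            identity.verify(post["signature"], post["text"].encode("utf-8"))
            return True
        except InvalidSignature:
            return False

    post = publish("New episode is up.")
    print(verify(post, public_identity))  # True on any client that has the public key

Because the signature, not the platform, proves authorship, a ban on one client doesn't take your identity or your follower graph with it.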
Starting point is 03:12:56 So, anyway, that is one of the things, you know, I was a very deep believer in. Look, I was lucky to work with some amazing founders and entrepreneurs, some of the best, deepest relationships. These are friendships I'll have forever, because they took a bet on me as much as I took a bet on them, and I'm grateful for all of them. Wow. What's that, Farcaster? Farcaster.
Starting point is 03:13:20 That's interesting. That's very interesting. I've not heard of that. Yeah, well, it's early days, but, you know, it's one of those things where I think, you know, them or somebody like them, it's one of the things that just needs to exist. Yeah, yeah, that sounds genius. Every day I go to bed, I'm like, oh, this could all end by the morning. You shouldn't have to hope that Neal Mohan is a nice guy. You should be like, you know what, I put in some work.
Starting point is 03:13:43 I deserve a little bit of this. I have a say in the algorithm. And right now, I'm sure they listen to you. You can get on a call and they'll probably listen to you, but you don't have an actual say. And I think that's the promise of crypto. Yeah. Wow, that's genius. I love that.
Starting point is 03:13:57 So let's move into AI. How did you get picked up for the position? Man. Senior White House advisor on all things AI. It is a pure, I would say, accident in a lot of ways. So going back a bit, a lot of people in D.C., you know, have long careers in public service. They have
Starting point is 03:14:24 a lot of aspirations to be here. I was not one of them. I was happy as can be back in Silicon Valley, back in the technology world, investing. I was thinking of starting a company. I was thinking of starting my own firm. I was just off
Starting point is 03:14:40 doing, you know, my thing. Because, I'm sure a lot of people will agree, D.C. just felt like this other universe. Like, you're like, well, a lot of things happening here, but I'm here doing my thing. But what wound up happening is, about a year and a half ago, you know, like everybody, I had gotten really involved in AI, I was investing in AI, but there was this narrative
Starting point is 03:15:04 that picked up about AI just killing us all. I don't know how much you paid attention
Starting point is 03:16:16 to the whole doomer argument on AI maybe a year and a half, two years ago. But there was a whole school of thought which picked up, I'm going to say in late 2023, sometime in that time frame, which basically said that, oh, we should just stop working on AI because this is going to become this superhuman intelligence which just takes over all of humanity. And I disagree with that, which we can really get into. The other thing was that a lot of these folks started influencing government,
Starting point is 03:16:45 influencing the Biden administration, influencing various legislators, including California. And what they said was, let's find a way to stop or slow down AI in any number of ways. And I was paying attention to this, but the one thing which really kind of struck me as very wrong was they tried to ban this thing
Starting point is 03:17:07 called open source. So open source, by the way, just for history: the software world, through the last 30, 40 years, has had two camps. One is closed source, which is something like Microsoft Windows, they build it in Redmond, they ship you the product, or Apple's OS X operating system, you just use it. The other is open source.
Starting point is 03:17:26 The classic examples would be Linux, which I'm sure you've heard of, or the Android operating system, where there is source code which any of us can go look at, modify, and then contribute back to. And open source is very important for a couple of reasons. One is it is kind of part of the spirit of the internet. It is how people kind of innovate, build on, you get kids, you get academics, you can download, you know, the latest and greatest, and you can build on it.
Starting point is 03:17:52 It's kind of the spirit of how all these engineering ecosystems at the heart of Silicon Valley work. The second is open source is safer. There is a great law called Linus's Law, named after Linus Torvalds, the creator of Linux. It says, given enough eyeballs, all bugs are shallow. It's a different way of saying sunlight is the best disinfectant. And it is, like, if you have the world looking at your code, you can't hide a security vulnerability in there. We're going to find it, right? Because one smart person might miss it, but with a thousand smart people looking at it, somebody's going to find it.
Starting point is 03:18:26 So what has happened over the last 20 years is, if you look at the heart of the things which power your phone, things which power security software, a lot of it is open source. Because people like me have trusted it, a lot of engineers have worked on it, it was very important. In AI, there was a growing effort to build open source, open-weight models. And again, bear with me, because I think ChatGPT is probably the first time people heard of these AI models, and, you know, there's been a long history of AI development. AI started in, I'm going to say, the 40s.
Starting point is 03:19:06 40s? Yes. AI is one of the reasons computation was invented. There's this guy, Alan Turing. You know, you might have seen The Imitation Game with Benedict Cumberbatch. He helped break the Enigma machine, or was a big part of it, which helped the British against the Nazis in terms of ciphers. He was a mathematical genius in the 40s and 50s, and he invented a lot of modern computation. He invented two really important ideas which underpin all of computing.
Starting point is 03:19:35 One is called the Turing machine, which basically says that anything can be a computer if it can decide between option A and option B, or it can follow an instruction. It is at the heart of every computer. But the second, which is even more interesting for AI, is something called the Turing test. Have you heard of this? No. The Turing test is, okay, I'm sitting in front of you. Imagine there's a door in front of us.
Starting point is 03:20:00 I couldn't see you. There's another door. Behind one door is a human being. Behind another door is an AI, right? The Turing test is, can I, as a human being, tell the difference and know who's the human and who is the AI, right? And it was always seen as sort of this theoretical, hypothetical test, right? But the thing about AI development, it started in the 40s and 50s, and it has always been, I'm going to call it, the holy grail, right? It has inspired people.
Starting point is 03:20:28 They came into the industry. For example, in the 60s, there was this guy, you know, John McCarthy, you know, he invented this amazing programming language called LISP, all because he wanted to build AI, right? And in every decade, there were people trying to figure out AI. In the 80s and 90s, people got really interested in the idea of neural networks. This idea was like, let's figure out how the brain works.
Starting point is 03:20:53 And then let's try and mimic it in a computer. And maybe we get AI, right? Like, for some definition of AI, right? Maybe you think Terminator and Skynet. Maybe you think 2001: A Space Odyssey, sorry, Dave, I can't do that. Maybe you think Data from Star Trek. Whatever it is, that's some form of AI.
Starting point is 03:21:12 Now, so basically, people have been trying for years. Now, the challenge with AI has been, over the last 50 years, you would see these ups and downs. Somebody would get really excited about a particular idea, right? It would feel promising for a while. People would do PhDs, they would build companies, and then one day it would run out of steam. Because what would happen is this piece of AI
Starting point is 03:21:32 that worked for one idea would not work for another idea. Maybe you can detect cats, but not dogs. Maybe you can translate English, but not French. It wouldn't scale. And so all these ideas, what happened, they have this little kind of hill, you know, this momentum and energy,
Starting point is 03:21:50 and then disillusionment. And people would leave the industry, companies would go out of business, and this was happening time and time and time again. Even in the 2000s, neural networks, which were super interesting and hot, every academic was interested in them in the 80s, but in the 2000s, people were like,
Starting point is 03:22:05 I don't know about these neural networks. We've been stuck for 20 years. We haven't made a breakthrough. Instead of that, let's figure out alternative mechanisms, other things that people could be doing. Now, there are two really key moments that happened in AI, because one of the questions I think people should ask is, why is AI interesting now?
Starting point is 03:22:25 Why not in 2010? Why was ChatGPT not built in 2005, right? Like, why is it being built now? So there's a long history of technical accomplishments, and then two very important moments. One was, there was something called AlexNet, which was built in part by this guy,
Starting point is 03:22:43 Ilya Sutskever, one of the founders of OpenAI, in 2012. But the most important thing, I would say, and this thing should, I think, someday win a Nobel Prize or something, is this paper that came out of Google in 2017, and the paper is called Attention Is All You Need, okay? I think this is going to be a historic paper.
Starting point is 03:23:03 I think this is going to be as important as Einstein's theory of general relativity; it is iconic. And the reason why it is important is that for the first time, you know, we found a mechanism that just continues to scale and work, right, with neural networks. Remember what I said, until then, there were all these start-stop attempts. You would start somewhere, you showed some promise, you would stop. With transformers and attention, this bunch of Google engineers figured out this thing, and they didn't know what they had at first, but it turns out it just kept going.
Starting point is 03:23:35 And it had this magical property called the scaling laws. And what it said was that if you give this more data and more GPUs, more computers, it just kept getting better across the board. And it won't stop. So far it has not stopped. And here is why this is important: every AI algorithm in the past had stopped. It worked for a while and then it did not scale.
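For readers who want to see the mechanism he's describing, here is a minimal NumPy sketch of scaled dot-product attention, the core operation of the Attention Is All You Need paper; the shapes and values are made up purely for illustration, and the note on scaling laws is a paraphrase of the empirical finding, not a formula from the paper.

    # Minimal sketch of scaled dot-product attention, the core operation of the
    # 2017 transformer paper. Shapes and values are illustrative only.
    import numpy as np

    def softmax(x, axis=-1):
        e = np.exp(x - x.max(axis=axis, keepdims=True))
        return e / e.sum(axis=axis, keepdims=True)

    def attention(Q, K, V):
        # Every token scores its relevance against every other token, the scores
        # are normalized into weights, and the values are mixed accordingly.
        d_k = Q.shape[-1]
        scores = Q @ K.swapaxes(-1, -2) / np.sqrt(d_k)
        return softmax(scores) @ V

    rng = np.random.default_rng(0)
    Q, K, V = (rng.normal(size=(4, 8)) for _ in range(3))  # 4 tokens, 8 dims each
    print(attention(Q, K, V).shape)  # (4, 8)

    # The "scaling laws" he mentions are the empirical observation that loss keeps
    # falling smoothly, roughly as a power law, as parameters, data, and compute grow.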
Starting point is 03:24:02 People tried to be smart. They'd be like, can I detect the human face in a particular way? Well, yes, but then people have different faces. Well, you can do the face, can you detect the bicep? And it just kept failing. But this algorithm, right, as long as you give it more data, more to learn from, and then more computers, more data centers, more energy, it just kept getting better. Okay? And so it was built by Google, but OpenAI, which had been started by Sam Altman and Elon Musk and a bunch of others, you know,
Starting point is 03:24:33 they kind of ran with it, and a few years later came out with ChatGPT, which I think is probably the real moment where people were like, oh, wow, this is really powerful, and so on. So just a lot of history in terms of how we got here, why we are even here at this moment. Now, ChatGPT is a closed model. When you use ChatGPT, Grok, Anthropic, Google, what does a closed model mean?
Starting point is 03:24:58 You type in a question, or maybe you give it an image, you give it a video, and it generates an answer for you. But you can't really see what it is doing behind the scenes. You can't run it on your laptop or you can't run it on your own data center. It is closed, not open. But that's perfectly fine because they have a business model. They spend hundreds of millions and billions of dollars on this. They want you to pay a subscription fee and use ChatGPT.
Starting point is 03:25:26 But a set of companies started building open source models. And these are models where, like, you could take ChatGPT, but a smaller version of it, and run it on your laptop, right? You could run it on your phone. Meta was one. They had this model called Llama, okay? Why was this interesting to how I got in here? Why this whole roundabout thing?
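As a concrete aside before he gets to why this mattered: "open weights" means you can download the model and run it yourself. A minimal sketch, assuming the Hugging Face transformers library; the model name below is just a placeholder for whichever open-weight checkpoint you choose.

    # Sketch of running an open-weight model locally with the Hugging Face
    # transformers library; a closed model can only be reached through its
    # vendor's API. The model name is a placeholder, not a recommendation.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    model_name = "some-open-weight-model"  # placeholder, e.g. a Llama-family checkpoint

    tokenizer = AutoTokenizer.from_pretrained(model_name)
    model = AutoModelForCausalLM.from_pretrained(model_name)

    prompt = "In one sentence, what is an open-weight model?"
    inputs = tokenizer(prompt, return_tensors="pt")
    outputs = model.generate(**inputs, max_new_tokens=60)
    print(tokenizer.decode(outputs[0], skip_special_tokens=True))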
Starting point is 03:25:46 A set of people got really convinced that open source was dangerous. They convinced themselves that it was going to help the Chinese, that somehow it is going to make the world unsafe, and they tried to get California as a state to basically ban open source. So I was sitting here, right, you know, minding my own business. And I was like, man, this is just wrong. Right, like, because this is the way the internet should work. This has been the heart of innovation.
Starting point is 03:26:10 This is how you get multiple small entrepreneurs and not just a few big guys. Not that I have anything against the big guys, they're awesome, but you need multiple entrepreneurs. This is just wrong. So me, as somebody who had no interest in policy, I started getting involved in these battles. Okay, so I started joining the right groups. I started putting my hand up.
Starting point is 03:26:28 And I was in the United Kingdom at the time. I was helping Andreessen Horowitz grow internationally. I had a meeting with the then UK government. They had a bunch of people. And they asked me, they asked the whole group, hey, can we make this open source model public? This was two and a half years ago. It was super safe, obviously.
Starting point is 03:26:47 I was the only person in the room who said yes. I mean, I said yes. This person next to me looked at me and said, you have just killed all of our children. I was like, whoa. That's a bit much. I remember thinking, wow, these people have infiltrated the highest reaches of government. And they have sort of scared the world into thinking that this AI is going to take over the world
Starting point is 03:27:12 and just kind of take over humanity, for reasons which I can sort of dispute and, you know, which I think are untrue. But that kind of got me personally motivated. So fast forward, the election happens. And I was, you know, close to David Sacks, the AI czar. And I said, listen, I have all these ideas for you, because I think this is one of the most existential questions. The Biden administration has taken so many wrong turns. They have hurt the American AI ecosystem. They have caused us to almost lose the race to China in a bunch of different
Starting point is 03:27:48 ways. And I think this is an existential issue. And David tells me, well, come to the White House and help fix it. And I was like, whoa. I didn't know that was an option. And for me, this country has just given me so much. And I was like, here's a moment in time where I have the chance to give something back. And I've been incredibly fortunate where, like, imagine you have some skill set in some area. Right. And all of a sudden, you get a chance to help your country with that particular skill set. Right.
Starting point is 03:28:27 I was like, I don't know when this chance will ever come again. I was convinced the country was going down the wrong direction on AI. I thought the stakes were incredibly high. Like, if we get this wrong, which I thought the Biden folks were, we would lose this race to China with catastrophic consequences. And here I was with this opportunity to, well, step up and try and do something about it. So I flew to Mar-a-Lago, and I got a call saying, hey, you know what, you're on the team. This was, I'm going to say, early-to-mid December, a little bit after the election.
Starting point is 03:28:59 Fast-forward a bit more, the president gets sworn in, and a couple of days later, and I suspect this was timed, China comes out with this model called DeepSeek. Have you heard of it? Oh, yeah. Oh, yeah. So DeepSeek is super important because it is an open source model. Okay, so first of all, a lot of the people who wanted to stop open source said, well, one of the reasons we don't want to have open source is because we don't want to help China. It turns out that the Chinese were actually way ahead. They actually had a genuinely fantastic model that surprised the world. Okay.
Starting point is 03:29:33 At the time, it was the only reasoning model, a model which can think and reflect on itself, that was not OpenAI's. It was ahead of so many other models that America had. It captured everyone's attention. And I don't want to take any credit away from the team that built DeepSeek. There was this team of basically hedge fund guys who were very, very good at building on top of GPUs. And it turns out that a lot of the skill that you need to build great models is programming GPUs very well. So they built some innovative, cool stuff. So I always tell people, DeepSeek has some great ideas that we hadn't seen before, but it is, I think, a Sputnik moment. Wow. Because it showed us that not only are we not like... Scared everybody. Yes, because not only are we not, like, far ahead, we are super close. And we are on this wrong trajectory where we could just wind up losing.
Starting point is 03:30:32 So we talked about all these companies, Google, Apple, etc. Imagine if in 1998 China built Google, and that's all we used every single day. China built the iPhone, and that's all we used every single day. And AI could be a much, much more important technology platform than those things. And we were off to the races. I remember the very first day coming in. I hadn't even been sworn in yet. So they had to give me a badge and do all these things, briefing everybody. And then the president comes out that evening and he says, we need to compete. We need to unleash American entrepreneurship. So that was, I think, the day before my first day. The next day I started, and we were off to the races, man.
Starting point is 03:31:14 Well, congratulations. Thank you. Late congratulations. But, you know, you had mentioned, I want to talk about, you know, you were talking about the DOOMers in AI. What is, I mean, I know a lot of the concerns, but I want to hear them, you know, what do you think the concerns are? What is it?
Starting point is 03:31:30 So let me sort of try and be intellectually honest and steelman what some of the concerns around AI are. There are, I think, several classes of concerns. The first, maybe the most important one, and this is not from the doomers, is AI is going to take my job. That's important, but that's not what the doomers are talking about. There's another set of concerns, which is AI could help build maybe a new kind of biological weapon, a new kind of, you know, nerve toxin, right? And those are very legitimate, serious threats. But the real doomer argument was this idea that as AI keeps improving, at some point,
Starting point is 03:32:13 AI models will start improving themselves. So instead of a human being being like, all right, I'm going to control this AI, I'm going to try and make it better every single time, at some point in time, a model will start to improve itself. And there's this word in sort of this AI debate, which is called takeoff or foom, which is kind of the sound of a rocket taking off. Where, if you hit that moment of improvement, instead of AI just becoming better, better, better, better, it just goes, foom.
Starting point is 03:32:49 And so their belief is, if that happens, what are the results? Well, you might get AI, they would say, that is not aligned with our hopes and beliefs. Not because AI is evil; like, if you see an ant, you're not aligned with its interests. It's not like we're particularly against ants, we just don't care as much. And they worry that AI might think of us as ants. And there's this famous thing called paperclip maximizing.
Starting point is 03:33:15 Have you heard of this? No. Oh, so paperclip maximizing is the idea that the AIs may not want to kill us, but they may not really know what we like. So they may put us in a job where they say, you know what, just make amazing paper clips, because they think that makes humans happy. And we're like, no, no, that's not an emotionally satisfying job.
Starting point is 03:33:35 So it is kind of used as a way to say AI might become this all-powerful, all-knowing intelligence, which then is going to be smarter than any one human or any one country, and then, just given its knowledge and power, could just control us and may not have our best interests at heart. I think I've done a reasonable job of conveying the scenario. And so, if you believe that, and they had some other concerns, especially the Biden people, they believed that, well, if you believe this, you need to make sure that we slow down, we don't get anywhere close, and we need to make sure that only America can build these AI models, no other country. Because why would you risk some other country having this superhuman intelligence?
Starting point is 03:34:24 They would often compare it to a nuclear weapon. And they would often compare a GPU, a graphics card, one of these thousands of GPUs which are in a data center, or hundreds of thousands, to plutonium. They would say, it's like collecting plutonium, and you don't want to have another country, even an allied country, having a nuclear weapon before you. So they had this thought, I would say this fear, of, we need to make sure that if we hit AGI, I'm sure you've heard of the word AGI, artificial general intelligence, it needs to be us first, and we should be kind of scared of it. So this was, I would say, some of the school of thought around the doomers, and I just didn't buy any of it. And the reason I didn't buy any of it is that we are so many years into these AI models, and a few things are absolutely clear.
Starting point is 03:35:17 There are absolutely no signs of takeoff. In fact, we're recording this in September 2025, and I would say for the last several months, every model has slightly leapfrogged over another, and there is no sign that any one model is taking off. There is no one model, as they call it. In fact, what is happening is these models are increasingly
Starting point is 03:35:38 specializing. You have one company which is building amazing models for code. You have another company which is building amazing models for maybe friendship or companionship, and other models which are great for scientific discovery
Starting point is 03:35:55 and thinking. So instead of having this one model which is becoming the superhuman intelligence and surging ahead, you're having this cluster of models, you know, all sort of giving people great benefit, but not showing any signs of takeoff. That's number one. The second reason why I disagree with a lot of the doomers is that it fundamentally underestimates human ingenuity. Human beings through history have been able to harness technology, right? The wheel, fire, right? Like, electricity.
Starting point is 03:36:31 Have you seen these videos of, you know, when people would try and scare people over electricity by, you know, by killing elephants? Like, you know, there was this whole thing where there was a lot of fearmongering about certain, like, one form of electricity versus another. So they would do these incredibly barbaric things.
Starting point is 03:36:47 They were like, we want to electrocute this elephant, this is why this form of electricity is not safe for you. There was a lot of fearmongering, right? And time and time again, human beings found a way to harness technology. Even with nuclear weapons, human beings found a way to harness nuclear energy. And of course, there was a lot of doomer thought against that, which we can get to. Same with the internet, right? There's a lot of downside.
Starting point is 03:37:10 We've found a way to harness it. So I think, if you think about AI, it fundamentally underestimates human ingenuity. Humans are not going to allow one all-powerful model to become a superhuman intelligence without being like, you know what, we're gonna have a say in this. We're gonna try and stop it way, way before that happens. There are probably gonna be a bunch of other AI models
Starting point is 03:37:32 which stop it. So I think it just fundamentally underestimates humans, all of us. You know, our human spirit, our creativity, our ingenuity as individuals and as societies. That's number two. The third piece is that, you know, instead of having, you know, what I think AI has become is, and I wanna come back later
Starting point is 03:37:52 to the jobs question, we absolutely need humans at both ends of the AI model. We need a human being to give it context and input. Like I was telling you before I showed up today, I went and asked a model, hey, I'm going on Sean Ryan's show, I am the White House AI advisor, what should I talk about? It was pretty okay, but it didn't know me really well, it didn't know you really well, right?
Starting point is 03:38:14 But if I'd given a lot more input and it worked with it, it would have done a lot better. The second thing, which we absolutely need humans for is on verifying the output, which is when an AI model gives you an answer, be it a diagnosis to a doctor, be it a suggestion to an accountant, or maybe a suggestion to somebody manning a drone, you will absolutely need a human being to check it, to verify it, to specialize in it. So when I think of AI, I think of like the Iron Man suit.
Starting point is 03:38:46 It amplifies you. It is a fantastic assistant. It does not replace you. So, anyway, all of this, I think, disproves the doomers and their version of this takeoff risk. But along with this, I think the Biden folks made a bunch of key errors. They felt that, one, we have to stop this AGI from happening anywhere else in the world. Second, they were convinced that there was going to be a shortage of AI chips and GPUs forever, and
Starting point is 03:39:18 that China just can't innovate. They just can't build amazing models or amazing chips. They were just wrong on all of it. And if you think about today, you know, people can just get AI GPUs from Nvidia, from AMD. There's a bunch of other companies you can just get them. There's no more shortages, no more supply constraints because the semiconductor industry is very, very good at reacting to shortages and honestly is finding a way to make money.
Starting point is 03:39:47 But the other thing is they really underestimated China, because since they really believed these AI models can be constructed only by a certain number of people in San Francisco, they did not think that some smart set of people around the world could build a DeepSeek. And it totally shocked them. And nobody predicted DeepSeek. So as a result of all this, I just think the whole doomer narrative, you know, set the country on a wrong path. I think it almost really hurt us in the race against China. And a lot of what we have done in this administration is try and undo that. Interesting. I mean, a couple of questions. Going all the way back to the different AI-specific models, you had mentioned one for friendship and companionship. What is that? Well, I mean, I would say it's more of a use case. But I think if you look at Grok, there are all these sort of characters with certain personalities people use,
Starting point is 03:40:43 sometimes not safe for work. But I think when GPT-5 came out recently, a lot of people were upset because they felt like they had built a friend in the previous model, with GPT-4. It had a certain tone, it had a certain way of speaking to you,
Starting point is 03:41:03 and I do think they kind of projected this idea of a relationship. And sometimes what some of these AI companies are doing is they're specializing in tone. Are we the friendly model, right? Or are we going to be just very clinical and cold in how we respond? So that is one dimension in which you can differentiate. But a lot of other dimensions in which you can differentiate are, for example, capabilities.
Starting point is 03:41:29 Coding is by far one of the most lucrative, most interesting capabilities in models right now. Have you heard of the phrase vibe coding? No. Oh, okay. So when I wrote code, and everyone wrote code this way, the way you did it is you typed in a piece of code, a program, you gave it to the computer, saw whether it worked or not, and then you took it back, and you wrote more code, and that's it, right? These days, vibe coding is basically the idea. If you
Starting point is 03:41:56 wanted to learn, say, a new programming language, right, you went and learned how it sounded, how it looked. You went to various corners of the internet. You figured out how to use it idiomatically, just like you learn a new language, right? It's one thing to look at the French dictionary. It's another thing to be able to speak in French idioms, where, you know, people are like, oh, I think this person knows French. And you have the same with programming languages. But AI models are incredibly good at coding. There are a couple of reasons for this.
Starting point is 03:42:22 One is that there's a lot of code on the internet. So AI models are just trained on a lot of code. The second, more interesting reason in my mind, is that coding is a way where the models can actually get better just by themselves. They can basically generate some piece of code and be like, is it good? Let me run it. Oh, it was not good, let me make myself better. So coding is one of these things
Starting point is 03:42:46 where they can just learn much better without having to have human input. If they wrote a poem, it's much harder to basically say, hey, is this poem better than that poem? But with code, there is an objective answer. Now, I'm oversimplifying, but that's a couple of reasons why coding has gotten just dramatically good on these models.
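A toy sketch of that generate, run, and check loop he's describing, in Python. The ask_model function is a hypothetical stand-in for whatever code-generation model you'd actually call; its canned return value is only there so the example runs.

    # Toy sketch of why code is a good domain for models to improve in: the output
    # can be executed and judged automatically. ask_model() is a hypothetical
    # stand-in for a real code-generation model call.
    import subprocess
    import sys
    import tempfile

    def ask_model(prompt: str) -> str:
        # Stand-in: a real system would send the prompt to a code model here.
        return "def add(a, b):\n    return a + b\n"

    def passes_tests(code: str, tests: str) -> bool:
        # Run the candidate code plus its tests; the exit code is the objective signal.
        with tempfile.NamedTemporaryFile("w", suffix=".py", delete=False) as f:
            f.write(code + "\n" + tests)
            path = f.name
        return subprocess.run([sys.executable, path], capture_output=True).returncode == 0

    def generate_until_it_works(task: str, tests: str, attempts: int = 5):
        prompt = task
        for _ in range(attempts):
            code = ask_model(prompt)
            if passes_tests(code, tests):
                return code  # the objective answer: the tests pass
            prompt = task + "\nThe previous attempt failed its tests; try again."
        return None

    print(generate_until_it_works("Write add(a, b).", "assert add(2, 3) == 5") is not None)

Poems have no exit code; code does, which is why this feedback loop works so well for programming.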
Starting point is 03:43:04 So vibe coding is this idea where you ask these models to essentially generate code for you. You can do this right now. Have you ever written code at all? No. Oh, amazing. Let me get you with this.
Starting point is 03:43:17 This is one of the most... you're going to teach me guns and I'm going to teach you code. Okay. Let's do this, right? One of us is going to look way more badass than the other. But historically, even five years ago, I would have been like, well, I'm going to send you a book, I'm going to send you a YouTube video. Because somebody would say, well, let's say, give me an example of something you want to do. Maybe an... do you have an... does this show have an app or a website?
Starting point is 03:43:40 You have a website. We have a website. Do you have an app? We have no app. Great. Let us say you want to build an app for the Sean Ryan show, right? It notifies you in this new episode, collects email addresses, right? Like, you know, you can sort of sign up to get early access, all these kind of things that app might do.
Starting point is 03:43:57 Now. We should build that. Please, you should. Or somebody watching should build it and get your attention. But, you know, now what you can do is I can teach you do it right now. What you would do is you would open up one of these models. you would say, I know nothing about computer science. I have no background in writing code.
Starting point is 03:44:16 Take this YouTube channel and build me a mobile application on my iPhone, runs on my Android, which does all the things I just said. That's it. And what it is going to do is going to start generating code for you. It might ask you some questions. Like, how do you want the screen to look like? Maybe you say, look, I want the screen to have this color,
Starting point is 03:44:35 have this functionality. It's going to generate that for you. And then it's going to maybe even run that for you. Right? And without maybe even you individually writing a single line of code yourself, you could have today a fully functional mobile app. People actually build much more sophisticated experiences. And really, you should try this tonight. Right now, or after this, you know what, like I'm going to show you maybe a demo,
Starting point is 03:44:56 and you should try this right now. And I think this is such a superpower because computers were often sort of this arcane thing where you're like, wow, I'm going to go to school. I have to teach myself this. But with models, anybody and everybody can just get into it and they can amplify. themselves, and you can focus on the thing you want to do, building a great app for the Sean Ryan show. So, wide coding is this idea where, you know, instead of spending a lot of time, trying to think of what the right code to do, you basically tell the model, here's the code I'm trying to build,
Starting point is 03:45:27 and what are the models, say, all right, yes, yes, yes, just keep going, keep going, keep going, and it's sort of a little bit of a tongue-in-cheek idea. The idea is, like, you don't get perfect code. You obviously, if you're writing this in production, if you're writing this for running inside a, I don't know, like a bank or a nuclear reactor, you absolutely want to make sure you check it and you know exactly what it is doing. But you're doing it for fun, it's great, right? You can explore things, try out a new idea, you know, maybe build something as a hobby.
Starting point is 03:45:57 So wipe coding ideas that you can just sort of go with the model. So that has fundamentally changed writing code where I don't know the exact stats, but a lot of tech companies, over, I would say, like 30, 40, 50 percent of their code is not. written by AI. Anyway, so my point is, going back a little bit, AI models instead of having this one model which takes over everybody and becomes this sort of this gigantic, you know, terminator brain, you know how multiple models which have specialized, like, I'm the great quarter model or I'm the great, you know, one of the great personality.
Starting point is 03:46:31 And so we see no signs of takeoff happening. So I think the DOOMers in my mind have been completely proven wrong. Is takeoff a, I mean, would you say it's a possibility? Well, theoretically, absolutely, but for me, you have to come back to science, right? You have to come back to the scientific method. So I would say you have to show me proof that it can happen. And every data point we have now is pointing the opposite direction. And what we were doing is in fear of, in my mind, this theoretical scenario which has had
Starting point is 03:47:06 like no existence proof off, we were basically shooting ourselves in the foot. We were trying to stop AI, we were trying to ban open source AI, we were trying to stop our AI from being used by other countries, our allies. We were trying to stop our allies from using our GPUs and chips because we were worried they would build AGI. At the same time, China was just searching ahead, building these models and building this chip capability. So if you ask me, is it a possibility, anything is a theoretical?
Starting point is 03:47:36 possibility. But we live in the world of reality where you have to be like, you know what, you know, show me the empirical evidence that we have any proof of this happening when I have so much evidence of things in the opposite direction. One of the things I heard the president say is, I'm not sure you've heard this, he says he hates the word artificial intelligence. He hates the word AI. And he's very funny about it. But I think there's some kind of truth to it. Because The word AI, I would say, almost oversells the space a little bit. Because in my mind, I just don't think of this as this thing where we're getting into some sci-fi future.
Starting point is 03:48:18 I think of it as the next great computer platform. This is like the internet. This is, you know, on this, Peter Thiel has says, on the scale of this is a nothing burger or this is, you know, some sentient AI sci-fi race, I'm somewhere in the middle. And I think I agree with him. This is like the internet, this is like the mobile phone, maybe bigger. Is this going to fundamentally shape humanity? Yes, the internet did.
Starting point is 03:48:43 Mobile phones did. AI is going to. But I have not seen the evidence that this is going to be some all-knowing, all-powerful God that takes over all of us. Makes sense. Makes sense. I mean, so, you know, for the Dumers, I mean, I've listened to all these things and all these different theories on what it could do, what it could turn into.
Starting point is 03:49:06 I mean, what would it take to kill it? I mean, wouldn't it just take removing the power source? It's a good question. Let me ask you, though. What are the theories you've heard? Everything that you've just said. Okay. It's going to take everybody's job.
Starting point is 03:49:19 It's going to turn into the Terminator. It's going to kill everybody. It's going to wipe out humanity. Yeah. Everything that you would say it above. Let me ask. Well, let me ask you. I would say one of the challenges of some of these questions is they are almost a theoretical thought exercise
Starting point is 03:49:34 where it is so hard to disprove something which is so theoretical, okay? But let me give you an answer. Imagine, you know, tomorrow there was this idea that, inso what, Shredaam was wrong? You know, he came on the genre and show he was wrong. We are actually seeing signs of these models taking off and maybe wanting to do bad things to human beings. If that happens tomorrow, what do you think you, me, all of humanity is going to do? Do you think we're going to sit still? No.
Starting point is 03:50:07 Do you think you're going to allow that to happen? No. Do you think all these other people, companies, engineers, do you think that, well, that's it then for the human race. Let's pack it up and go on home. No. They're going to stop it way, way before it ever becomes a thing. Like, we talked about social media, right?
Starting point is 03:50:24 Like, you know, social media, you know, today versus 10 years ago is so different. There's so many checks and balances and laws and regulations. So this idea that, you know, you go from this sort of this what we have today into all-knowing God without reasonable people, you know, whether to be in government, technologists, just regular human beings, without being like, hey, you know what, let's hold up a second here. Let's just stop and think about it. Let's put in some safeguards. Let's, you know, have, you know, ways to counter these AI models.
Starting point is 03:50:56 Like, that will definitely happen. Like, there is no way we just get from here to there without a bunch of things in the middle. So when sometimes, when people say, how do you stop this all-powerful AI, which is taking over a data center? I'm like, how did it take over the data center? What are the 25 steps which happened before then? How would it amass all this power? I'm pretty sure somebody stopped it when they took over the second data center. So that's my first sort of like instinctive reaction to when people post those questions.
Starting point is 03:51:26 The second thing I would say is the best answer against models is other models. And I think this is very true, by the way, in what I think sometimes the future of cyber warfare might look like, where the best way to defend against maybe a model which is showing this crazy capability is have another model, you know, which is looking for these capabilities. So, look, I hear the sci-fi. I grew up on sci-fi. I grew up on Star Trek. I grew up on 2001 Space Odyssey. I've seen, you know, the Terminator movies. I know all the risks. I know all the theory. The thing is, we have no proof, no evidence, we are anywhere on the track. Second, we have a lot of evidence. We are in the opposite direction.
Starting point is 03:52:09 This is an amazing platform. There are some questions in terms of how do we benefit humanity, but there are no signs of takeoff and sci-fi behavior yet. And most importantly, China is searching ahead. So if we rest, if we stop ourselves, the other side is not. And I understand that. Before we move into China, I do want to ask you one question. I mean, do you have any concern about people in their relationships with AI?
Starting point is 03:52:41 So we're starting to see people ask very personal questions, use AI as a therapist. Use AI, should I get divorced? How should I discipline my kids? You know, and they're asking AI very, very personal questions. And the, I don't do this, but the AI will spit out an answer. And then I think that, I mean, we've seen manipulation with social media to an extraordinary extent. And so, you know, I think that the manipulation through the potential of manipulation through AI could be even bigger than, you know, what social media has done. I mean, do you have any concerns about?
Starting point is 03:53:28 Absolutely. I would say this is at the heart of why we made that executive order happen. Exact heart. Like, you know, imagine, you know, some young kid, you know, influence, ask AI a very personal question. We don't want ideology to influence an answer. We want the honest truth. So the one of the things executive order does is to basically say, if your model, you know, it's not truth-seeking, the government will not work with you. And I think that, I absolutely have, you know, that concern. There's also, there's another very interesting concern which I think is going to come up more and more, which is the idea of privacy. So, or confidentiality. If you go to your doctor, your lawyer, or your priest, what are you promised?
Starting point is 03:54:15 You know that nothing you say can ever kind of get out of that context. There are laws, there's social convention where, you know, you have attorney-client privilege. you have doctor-patient confidentiality, right? Various religions have this construct where what you say is sacred, except, you know, if you do something like really crazy. Now with AI, we are starting to see people
Starting point is 03:54:40 ask very deep personal questions. They say, look, here's my medical report. Like, what does this mean? Like, you know, give me a sense of what my blood test means. Or, you know, maybe I'm not feeling great about myself. you know, what should I do? Or maybe just basic career advice.
Starting point is 03:55:00 I know a lot of people who, you know, ask AI for career advice. I would think it would be weird for all of those to be public. Imagine if somebody could say, you know, if you gave a piece of AI, your medical history, and somebody could just say, you know what? Just like I can get access to your emails, I want to get access to every single thing you're asked a piece of AI. So I don't know what the right answer is, but I do think the way we work with these AI models is a little different than other pieces of technology. And I do think we're going to have this public conversation about what are the right legal constructs. You're going to protect
Starting point is 03:55:39 that. But yes, I absolutely do have concerns about ideology, and that is at the heart of why we did this. Yeah, I'm not just talking about privacy. I'm talking about action. I mean, what if somebody were to, I'm just pulling something out of thin there. What if somebody were severely depressed? You know, and they are, you know, they're asking an AI series of questions that it ends up with, you know, should I, should I kill my partner because they upset me? Should I kill myself? And I mean, the AI is going to respond to that. Oh, yes. You know, and so that's kind of what I'm getting at is when I'm talking about manipulating population. Privacy is this is another concern. Yeah. But I wasn't there yet. But, you know, it has the potential to. manipulate entire populations, the entire population. Absolutely. And there have been these very tragic incidents, I would say, in the last few months,
Starting point is 03:56:36 where AI has encouraged people down a very, very bad path, where a regular human being would have been like, hey man, maybe you need to get some help. Maybe you should have a conversation. And so I think one of the things which a lot of the leading model companies are working on is addressing sycophancy, this idea that you just don't want an AI which just agrees with you, but you want an AI which spots patterns of, you know what, this person may need some help.
Starting point is 03:57:09 Or maybe we need to alert law enforcement and say, like, there are a kind of precedence for this, by the way, in other places when, look, one of the things I'll say about social media platforms is they just see a lot of very dark things in humanity. And, you know, people wanting to do things to themselves, people trying to do really terrible, bad things to others. And they build a lot of systems over time to try and kind of deal with that. And most social media platforms, if you, I'm not going to say every single time, but they try and, like, if you try and do something where you might be harming yourself. Or maybe you express a desire to harm someone else. They try and find which directly.
Starting point is 03:57:48 They're not perfect, obviously, you know. And I think AI companies, this is going to be an existence. question for them. They need to find ways to figure out how to spot it when people are going down dark paths and make sure either they're alerting someone or they're kind of coming out of it or you're just not saying, hey, man, yes, you're absolutely right in everything you believe. And I think this is going to be a key, key topic for all of AI. Okay, okay. Let's talk about China. I mean, that's the big concern. I mean, Xi Jinping has said, you know, the winner of the the AI race will dominate the entire world?
Starting point is 03:58:25 I mean, I have a general idea of how that would happen, but how does, I mean, how far ahead is China than us? Well, I think we are ahead right now, but maybe it's more useful to break down a little bit of a scoreboard. Okay. I would say, let's start with the basics. What AI needs is number one is infrastructure. infrastructure in terms of energy, energy from which powers data centers.
Starting point is 03:58:58 Because again, the scaling loss, more energy, you get better models, you can use more AI. And there, I think one of the fundamental challenges the United States has had is for many, many years, our energy usage as a country has been fairly flat. I think it's improved by a small percentage every single year. And then all of a sudden, AI shows up. And you need a lot more energy. You need a lot more data centers. And then all of a sudden, you have this spaghetti bowl of issues which suddenly come up.
Starting point is 03:59:32 The first one is, where do we get the energy from? And that is where I think the current answer is absolutely natural gas. That's where I think it's going to power a lot of these AI along with what the president would call clean, beautiful coal, but also the future is definitely going to be nuclear. That's going to be a big part. On this, I would say China is ahead because they have just invested
Starting point is 04:00:02 in building out their energy grid, building out generation, they've done a lot of work on nuclear and building out the grid that transmits power. We, on the other hand, one is we have work to do across all these fronts. I'll talk about what we are doing as this administration, But one, if we need more energy, second is we have all these outdated, broken rules and laws which stop data centers from being constructed, just from a lot of, I would say, completely nonsensical
Starting point is 04:00:34 climate concerns. And we need to get out of the red tape. We need to get out of the red tape, and we need to, what the president call, build baby, built. We need to build these data centers. We need to build energy. and we need to also figure out a way to upgrade our energy infrastructure. And we have a peggatty mess of issues. So this administration, you know, we did a bunch of things to attack this.
Starting point is 04:00:59 There's an executive order which has, you know, come out, which is tackling nuclear, which is said, I think, I forget the exact number of years, but I think for 30, 40 years, the NRC, the Nuclear Regulatory Commission, hasn't approved a single reactor. And I think there's a whole future, but I know you had folks from the nuclear industry here with SMRs and so on. I know it's like a little bit of time away, but I don't think we had an executive order which basically says, look, you know, we kind of believe, you know, the nuclear definitely has a strong future in America. In terms of the present, though, we need energy now. And this is where the president set up something called the National Energy Dominance Council, the NEDC, which brings together the Secretary of Energy. the Secretary of the Interior,
Starting point is 04:01:46 and we've been working closely with them, which basically tries to attack all of this. We have to say, it's like, how do we remove the red tape on building data centers, on permitting, on regulation, like, what are all the things that we can do to just get more data centers built, more energy going? So just on the scoreboard front, this is one where China's ahead.
Starting point is 04:02:08 And we had obviously done a lot to catch up, and I think we're going to search ahead, but, you know, for the last four or five years, five years, they've just been on a great trajectory, and I think we have a lot of great work here to catch up and obviously exceed them. The second part is chips. So chips are super interesting. We should spend a lot of time. I see on your shelf, you have chip war by Kelly, Chris Kelly out there. On chips, we are ahead. But maybe not as much as people think. So when I talk about chips, there are multiple layers. But essentially, for AI right now, the Most important ones are the GPUs, which are built by people like Nvidia and AMD. And then, of course, you have Google, which has built their own hardware in terms of TPUs.
Starting point is 04:02:55 And then Amazon has their own hardware. But in media, AMD, all these companies are obviously ridiculously important. And they are right now really far ahead of what the latest and greatest from China, which have companies like Huawei, which builds a product while the Ascense, or the other companies called CambridgeCon have.
Starting point is 04:03:14 Now, why are we ahead? There's multiple answers, but part of it is because we have access to better technology from TSM in Taiwan, which I know you're very, very familiar with. We have access to much better software, which runs these GPUs, and we just have a lead on them. Now, it turns out, though, on this case, China has done a lot of work on catching up.
Starting point is 04:03:41 And instead of having a multi-year lead, I think our lead is much, much smaller. And right now, what China has been doing with companies like Huawei and Cambridgeon is really worth paying attention to. A few months ago, and this is, I think, a very interesting one to look at, Huawei came out with this product called Cloud Matrix 384.
Starting point is 04:04:02 And it's worth Googling and looking up. And the reason why this is important is, if you think about these data centers, right, If you take, say, for example, Open AI or GROC, they have a bunch of Nvidia GPUs inside them. And they have them in these clusters, right? Like if you have a H100, which is what a lot of people use, you have eight of them clustered together,
Starting point is 04:04:24 and you have maybe several hundred thousand of them. Or more recently, Nvidia has come up with this product called the Blackwells, where you connect 72 of them. But you kind of cluster these. And the idea is the more of these you bring together, the better it is for training a model or inferencing from a model.
Starting point is 04:04:43 Cloud Matrix 84 is interesting because people believe that China was way behind innovating GPUs. And what they did was they built a cluster where you take 384 GPUs, ascend Chinese GPUs. Now, each of those use way more power than Nvidia. They are not as fast, but it doesn't matter, because guess what, China has more power. They don't care as much as we do.
Starting point is 04:05:06 Second, Huawei has really good networking technology. their networking company. So they're able to basically connect, if I'm oversimplifying, a lot of not as great GPUs, but to try and compete with much more powerful GPUs. And I think of this as an interesting example, just like Deepseek, of when we try and say, we are not going to let the world have our technology
Starting point is 04:05:31 or let China have our technology, they're often been very good at working around in other very creative ways. So that's number two. But I think on the chip side, we still have an advantage in the performance of each GPU and also how many we can make. For a lot of reasons, we can just make several million of these. And China is definitely a lot more hamstrung in how much they can make, but we are ahead.
Starting point is 04:05:57 Models is very interesting. On models, if you talk to somebody on January 15th, or January 18th, they would say, oh man, American models are way ahead. We have open AIs at the time, 01. We have Claude 3 or we have whatever GROC2, I think, at the time. And China doesn't have anything. DeepSeek totally, in my mind, demolished that idea.
Starting point is 04:06:22 Because all of a sudden, they were not ahead, but they were very close. And one of the phenomenon that happened in AI is this idea of distillation, which is you can take a really powerful model and you can distill it to make a smaller, not as powerful model, but just slightly close behind. TLDR, what happened is China proved to us
Starting point is 04:06:46 that they can build really, really good models, in some ways, surpassing what we have. I was at a developer event in San Francisco recently, and I asked the crowd, what are you guys, how many of you using a Chinese model, like Deepseek, Kwan, KLM, there's a bunch of others, and almost everybody in the world, room put up their hands. And I was like, whoa, this is not good. Why is this not good? Number
Starting point is 04:07:13 one, it is soft power. Like China having a technology platform, which is now running, you know, inside American infrastructure, hopefully not American infrastructure, but like American companies, definitely not even global companies where instead of our models. Second, these models also communicate culture. Think about my story. I grew up on the internet. I grew up on the English internet. I absorbed a lot of American culture just because of the prevalence of you know, America winning the internet. But if China dominates the model race, if you look at deep sea, it doesn't really share our ideology. And we don't want that to be the dominant ideology and values around the world. You asked about people asking models very personal
Starting point is 04:08:03 questions. Think about a world where people ask a Chinese model, very, very personal. questions. I'm not sure I would want that. One, on the model side, I think they've caught up very close behind. I think they're ahead on open source. We are catching up quickly. There's been some great new launches recently. We are still very much ahead on close source models. The latest and greatest GPT5, Gemini, GROC, you know, we are still ahead, but it is a much closer race. The last and maybe most interesting part of the race is what I would call diffusion. which is, how are people using AI?
Starting point is 04:08:40 Because at the end of the day, AI needs usage to be better. One of the reasons chat GPT became really good is because when people used it, you could use that feedback mechanism and become better. This is very important, not just by AI today, but important for the future, like robotics. One of the things which is going to be key
Starting point is 04:08:56 to getting robots, American robots all over the world, is can we get that robots like data of usage either from a simulation real world and make it better? So I think there's a race right now, which is who can get their AI to spread faster, to be used faster. Okay. And historically, when you grew up, did you use Windows laptops or Windows computer when you grew up as a kid? Do you know who Windows's competitor was in the 90s? No.
Starting point is 04:09:28 Nobody remembers. There are companies like IBM's OS2, others, they all got crushed because Microsoft Windows, along with Intel, dominated. Why? They got everybody to use it first. And they got all the developers to build applications on it first. They probably built office, used office, games that you use. NetScape now. Everyone used
Starting point is 04:09:48 Windows. So when you get everybody to use your stuff, you get this ecosystem flywheel, right? More smart people start building on your stuff. Your stuff gets better. More applications make your platform better and on and on and on.
Starting point is 04:10:04 Right? And right now, we have a window of time where we can make American AI the default here around the world. And if we don't, China will try and make their AI, their chips, their models, future, their robots, the default around the world. And that, for me, is the heart of the risk. So if, with all the concerns and all the different sectors, I mean, it sounds like we're ahead on chips, software, but we're behind on energy and energy seems to be I mean with the little I know you know what I mean energy seems to maybe be the most important factor in AI and data centers and all this other stuff and so what are we doing specifically to unleash I mean I've dove an into the power grid the vulnerabilities I
Starting point is 04:10:57 I mean, it's, it is a, it's atrocious of neglect for our energy system, our power grid, and for, for years, maybe, I mean, years and years. And, you know, if China's ahead, I mean, what are we going to do to unleash nuclear power? I mean, it seems like that is, that's the key. Yeah. And we have all these innovators that are, you know, we talk to Isaiah Taylor, who's building the mini reactors. We talked to Scott Nolan, who's, you know, enriching uranium. But we need to go faster. We need to go faster.
Starting point is 04:11:38 And all these guys are very impressed with the current administration and getting rid of some of the red tape. But I still don't feel like, I mean, if China has zero red tape, how the hell are we going to compete with our power grid? I mean, we talked to Bajibat, you know, the founder of Robin Hood, and he's trying to beam solar energy in from space to receivers that are going to get averted into the grid. I mean, we've got all these amazing ideas, but it's just not going fast enough, in my opinion. Well, I think I totally agree with you.
Starting point is 04:12:20 This is one of those things where, you know, we just need to move as fast. as we can. So I would think about it in a few layers. I absolutely think nuclear is the future. But I think we're still a few years away from getting there. And we have data center needs right now, like today. Like what is stopping the training of the next large model? Well, we need to have a larger data center.
Starting point is 04:12:44 And we are seeing entrepreneurs all the time. Like Elon, for example, if you see how we build Colossus, it's in Memphis. And so he builds this old, he buys his old, I think Electrolux factory. And then he drives in all these generators on all sides and then uses Tesla solar packs to basically even it, you know, kind of augmented, write some code
Starting point is 04:13:10 to even it all out. And it just is one of these amazing speeds of engineering. But the point being, we need energy right now. So just on the nuclear front, it's absolutely the future. So administration has an executive order which basically, you know, tries to clear the red tape on the permitting for all things nuclear. But my sense, and I'm not like a deep nuclear person, it's my sense from talking to the best people in this, it's like we're still looking out like a few years.
Starting point is 04:13:37 I know firsthand how tough hiring can be. Finding the right fit takes time. You wait for the right candidates to apply, sort through stacks of resumes, and then try to line up an interview that works with everyone's schedule. It's a constant challenge. Well, the future of hiring looks much brighter because ZipRecruiter's latest tools and features helps speed up finding the right people for your roles.
Starting point is 04:14:03 So you save valuable time, and now you can try ZipRecruiter for free at ZipRecruiter.com slash SRS. With ZipRecruiter's new advances, you can easily find and connect with qualified candidates in minutes. Over 320,000 new resumes are added to ZipRecruiter. recruiter every month, which means you can reach more potential hires and fill roles sooner.
Starting point is 04:14:29 Use ZipRecruiter and save time hiring. Four out of five employers who post on ZipRecruiter get a quality candidate within the first day. And if you go to ZipRecruiter.com slash SR right now, you can try it for free. Again, that's ziprecruiter.com slash SRZZIPRecruiter, the smartest way to hire. According to new reports, central banks around the world may be buying twice as much gold as official numbers suggest. You heard that right. Twice as much. And get this. Some are bypassing the traditional markets and buying gold directly from miners in Africa, Asia, and Latin America. That means no U.S. dollars, just straight physical gold. The shift isn't just symbolic. It could be strategic. They could be looking to bypassed Western financial systems.
Starting point is 04:15:21 So if central banks are scrambling to reduce their dollar exposure and hold more physical gold, should you do the same? That's where the award-winning precious metals company Goldco comes in. Right now, you can get a free 2025 gold and silver kit and learn more about how gold and silver can help you protect your savings. And all you have to do is visit shonlikesgold.com. Plus, if you qualify, you could get up to 10% back in bonus silver just for getting started. Go to shonlikesgold.com. That's shonlikesgold.com. Performance may vary.
Starting point is 04:16:00 You should always consult with your financial and tax professional. The game right now is gas and gas turbines. And the challenge there, well, on the energy product. then you have on the energy grid side, to your point, we have a very old grid from one in infrastructure capacity. And second, we have all these local state utilities and monopolies, which have weird regulation, which makes it very, very hard to take power from one state to another
Starting point is 04:16:33 as a whole cluster there. So we came out with this document a few weeks ago called the AI Action Plan, which is the entire AI strategy strategy for America in my mind. I spent a lot of time on it, a lot of others who spent a lot of time on, the president announced it. And then that, one of the top priorities
Starting point is 04:16:51 we talk about is, one, removing the red tape for data center construction, right? With things like, how do we make sure that is the NEPA, the National Environment Protection Act, how do we find carve-outs so that we can get data-centered construction just going, right? Let's just get the red tape out of the way. Let's go build, build.
Starting point is 04:17:11 There are directives in there on figuring out what to do with our grid capacity, figuring out incentives on energy. I think the president in his first week announced this large project called Stargate in terms of getting more investment and getting more of these data center construction going. So in my mind, nuclear is the future. I think we've done a lot to clear out the red tape. But I have a race to win right now. Every entrepreneur is like, look, I need to get 100,000 more, 200,000 more GPUs unlock, what do I do?
Starting point is 04:17:50 So we need to unlock that. And I think unlocking data center construction, grid capacity, clearing of the red tape, so the power can come to the data center, how do we find smart ways to go do that, really innovative ways to go do that. That's the game right now, which I think, and I don't want to take credit for this,
Starting point is 04:18:09 for this because I do think the Department of Energy and Interior have done a lot of great work on this. They are very, very focused on, all centered around gas and coal. Moving on to chips. You say we're ahead on chips. I think it was, was it April of this year, we said no chip sales to China, correct, from Nvidia? Was that correct?
Starting point is 04:18:37 Not really. we said, I believe it was August, right? It was last month. August we said, NVIDIA can sell chips to China, but we get 15% of the revenue, correct? The H20 chips, which are the high-end chips, correct? Was that a mistake?
Starting point is 04:18:55 So actually, there's a few corrections in there. A little bit of history. So, NVIDIA makes a whole set of chips. And the latest, greatest generation is called the Blackwalls. They usually started the letters G&B. The previous generation was called the hopper. They started the letter H.
Starting point is 04:19:13 The top of the line was the H100, which honestly is what a lot of people have right now, or the HH 200. They kind of the top of the line in America. Now, about two and a half years ago, the Biden administration came out and said, hey, we're going to remember what I said. The Biden administration had all these mistaken beliefs.
Starting point is 04:19:32 And one of the beliefs I think they had was that China, one, our allies can't have our AI chips and the China can't ramp up chip technology. So, because since they believed that, they said, number one, we're going to put limits on the power of the chips, the flaps of the chips that China can get. And then we're going to slice the world into three categories. This was called the Biden Diffusion Rule, where some countries, like USA, UK, can get any number of GPUs they want, about 100 countries.
Starting point is 04:20:07 You know, couldn't really get many GPUs without like going to this crazy amount of red tape and then some would just not get any GPUs at all like Iran, North Korea, you know, the countries are concerned. And as a result, like, you know, a couple of things happened. One was that a lot of our allies were like, well, we want to use American stuff, but you guys won't work with us and they were just left out in the cold. A good example is like the Middle East, where the Middle East and countries like other countries are like, we want to bring AI to our citizens. And what is the way where we can get your GPUs? And the Biden diffusion rule said, there is no way you can ever get our GPUs. That's number one.
Starting point is 04:20:52 So we essentially kind of turned our back on our allies all over the world because we thought, number one, you know, there's a supply constraint in GPUs that if at G. GPU ever goes out of America, it means one less GPU for an American company. We thought China will not compete on chips. Third, we thought AGR can only happen in America. It's a scary, scary thing. So we came in and we were like, this is just wrong, you know, because number one, we're making a lot of GPUs.
Starting point is 04:21:24 If you want, you can go get a H-100 like right now, and H-200 like right now. Because it turns out that all these semiconductor companies, when they find the supply constraint, they're very, very good at innovating and working on it. They are very capitalistic. They want to make money. So, number one, if we ship a GPU to somebody else, that doesn't mean one-less GPU to America. Number two, as I said earlier, I don't think we worry about AGI exploding anymore.
Starting point is 04:21:49 We talked about that. I don't think we worry about takeoff anymore. I'm not so much worried about a allied country building a super intelligent AI ahead of us. I just don't think that is a realistic possibility. So we said, first of all, to our allies, we're going to tear apart us 200-page crazy document called the diffusion rule. And the first thing, I want to talk about the Middle East, then I want to come to China. President Trump, in this first state visit, he went to the Middle East, and we struck these deals called the AI Acceleration Partnerships. And this idea was that we will sell you these GPUs in return for investment in America.
Starting point is 04:22:29 and with iron-clad security provisions to make sure that those GPUs are going to stay where we ship them, they're not going to go to some other country, and they don't want to have somebody we don't like accessing them. And the idea behind all of this is like, we want our chips, our models to be used the world over. We want to be like Windows, we want to be Intel, and we don't want to risk competition.
Starting point is 04:22:55 Because in the meantime, what went up happening is Huawei has been ramping Nupcheek production. So we now have a competitor who has a good product. They build cloud matrix. They are building a sense. There is a story in Bloomberg last week about Huawei exporting chips or trying to export chips to multiple countries around the world. So if you think of American AI as a product, we have a competitor who's out there selling a competing product out there. So our stance is that when it comes to our allies, we want them to have American AI. Our chips our models, in return for investment in America,
Starting point is 04:23:33 and with the right security safeguards in place. And always, always underscore that. It is in the terms that we sign, for example, a couple of these Middle Eastern countries. Now, hopefully that's kind of clear on why we want our allies to have our technology. China is an interesting, different case. There are two schools of thought on China.
Starting point is 04:23:53 One school of thought is what you said, which is we should just deny them any chip. any GPU, and the idea there is that, hey, if we deny them anything, number one, they're not making their own GPUs, they're not really making their own model. That was the original thinking. So we will just make sure we get this race to AGI. We will be far ahead. It turns out that was fatally flawed, okay? Because number one, China's making chips. They're innovating and they're building great models. And they're going, you know, knock on other people's doors. They're like, hey, America is not selling you, the industry is not giving you any of their technology at all, maybe you want to work with us. And so we think the answer is we will always keep the latest and greatest GPUs for the United States in huge quantities. That is irrevocable. We'll be the only country which can build these massive, massive super clusters of GPUs. But on the other hand, we want to make sure that we are not giving Huawei, Cambricon, all these Chinese competitors, oxygen, oxygen of growth, oxygen of
Starting point is 04:25:06 revenue and usage, right? Because when somebody uses a GPU, they're writing code on it, they're finding issues, they're finding bugs, they are making it better. So what is the answer? How do we thread this needle between let's keep our big stuff, but let's not give the competition any auction? And I believe the right answer is let's ship them technology that is a ahead of what the competition has, but way behind both in individual performance and also in quantity what we have. So that's the principle and the framework.
Starting point is 04:25:42 So if you're gonna follow that, where does the H20 comment? The first thing about H20 is because, you know, some of these Biden staffers like to spend, you know, talk about this is Biden never banned H20s, ever. They were completely allowed all the time. through the administration. Because what happened is when Biden's team built out these rules,
Starting point is 04:26:04 Nvidia went out, and by the, Nvidia has H20, but AMD has something called the MI308, which is the equivalent of the H20. They said, okay, we, both of these companies, we are going to build a nerfed version of our, you know what Nerf means in gaming language? It's a less powerful version of our GPU, which is way behind what the latest and greatest America has.
Starting point is 04:26:27 And we are going to ship this only to China because it will keep us ahead of what Huawei has so we can get these Chinese customers using it, but we are way behind what America has. Envidia built this, AMD built this, the Biden folks were completely okay with this. So we came in, and the first thing we said is one, we tore up on the diffusion rule for the rest of the world.
Starting point is 04:26:49 When it came to China, we said, okay, we need to know exactly what is going on when we ship GPUs to China. So we said, we're going to bring in a licensing regime. So every time you see a headline that says President Trump banned hedge 20s in April March, not true. We brought it under a licensing regime, which basically said, look, if you are going to start to export this, you need to ask us for a license first. Actually, the technical term is called an insane formula.
Starting point is 04:27:17 We send these companies, we need to come ask us for a license first, right? So we know how much are we shipping and who's getting it on the other end? Right. And now, you know, as you mentioned a couple of weeks ago, these are my partners in the commerce department who are actually in charge of the licenses. Like, they kind of, they take all the credit for thinking through all this. You know, them, along with the president, said, okay, you know what, we're going to get a great deal for the American people. We're going to start approving some of these licenses of these much older, less performing chips way behind what America has in wayless numbers than what America has. And we're going to get a great deal for the American people. So that's the history, but that's all the past. I like to think about the future, because the H20 is now two and a half years old. And every company has a new generation roughly every single year. And one of the amusing things for me is we are still talking about this,
Starting point is 04:28:11 even though since the time the H20 has built, we've been through two iPhone generations. It's like we're talking about, I don't know what the data's iPhone is, we're talking about, like, iPhone 11, and we're, you know, several generations behind. This conversation is going to keep coming up again and again, again and again and again. So what is the right long-term strategy? In my mind, it is, one, we need to flood the zone
Starting point is 04:28:33 around the world with our allies with American technology. We're starting with American GPUs. Why? You know the whole old business model of Gillette, which is you sell them a razor, but you make money on the blades? You remember that? GPUs and models are a bit similar. If you are a country and you spend a few billion dollars
Starting point is 04:28:52 buying American chips, Guess which models you're going to use? Probably American models. You're going to make both those models and the chips better on using it. When you build a next data center, you already have all these innocent American models. What are you going to do? Probably buy American all over again. Now, if we refuse to do business with you, you're probably going to go to a competitor at some point in time.
Starting point is 04:29:13 Because AI is just too important. We just can't keep telling people to pound sand every single time they knock on the door and say, I want to bring my citizens' AI. So one, we should flood the world, our allies, with American technology, with GPUs, with models. Again, America will have a overwhelming lead in the quality and the quantity of these GPUs. We'll be the only ones who are building, you know, Elon, I think, is going to build a million-plus cluster this year. I know, I think Google has way more TPUs. We'll be the only country which can build these massive ones.
Starting point is 04:29:48 But if another country is like, hey, I want to bring my citizens' education. I'd much rather them doing it on an American GPU, running an American model, generating American tokens. So that's for our allies, right? And we can talk about who the allies are. For China, our belief is the right strategy, is find a way where we keep the latest and greatest, but ship technology.
Starting point is 04:30:14 So if they want to build a chatbot, if they want to build education, I would much rather have them use a older nerfed, a lower quantity version of our GPUs rather than buy a Chinese company and then keep improving it. Because the reason is if we do that, what happens? Let's say we ban all American technology China. We say you can't get anything.
Starting point is 04:30:37 Well, they are going to say we need to accelerate all of our indigenous chip making efforts because you guys left us no choice. This happened before, by the way, about 10 years ago, right, like we stopped exporting supercomputing chips to China, and within a couple of years, Chinese indigenous supercomputing ecosystem just exploded. And they have, I think, within a few years, like, much better supercomputers and the chips we were exporting them too. So if we force their hand, they're going to now build out an alternative technology stack. And then they're going to start exporting it. They are going to go to other countries because we are being difficult to do business with in this Biden scenario.
Starting point is 04:31:18 We're like, you know what, buy a Chinese chip, and it's going to come freely loaded with deep seek on top of it. So that's the scary scenario. So I think the right answer is we ship China, older, smaller quantity chips in enough quantity so that we retain the latest and greatest. We retain these super clusters, but it is enough to make sure the competition there does not get oxygen. Now, I understand everything you're saying. And, you know, I mean, I know you're very aware of the China-Taiwan conflict. I mean, so just the fact that we are sending them older technology, older chips, that aren't up to snuff with what we have.
Starting point is 04:32:03 I mean, do you think that's more motivation for them to make a move on Taiwan? To talk over? I don't want to speculate. It's hard to say. because... Because then they cut us out. Well, my belief, I'm not a deep geopolitical expert in Taiwan. I'm much more familiar with the semiconductors.
Starting point is 04:32:28 I think the motivation there is more about sort of how they believe history. But I think it's a good question, which is... I think there are other motivating factors, right? I'm not the expert. I think some of you have a lot of great guests on. Number one is the Biden rule definitely amplified that. Like, because if we go tell people, pound sand, you're not getting anything,
Starting point is 04:32:50 you sort of have to find ways to find alternatives. We are relieving the pressure, right? Like, you know, again, I keep emphasizing that because somebody, when somebody, they, like, we are, have the latest and data, and we'd be the only country who can build these million GPU clusters that, you know, any one of these model companies have. But at the same time, you know,
Starting point is 04:33:10 I want to make sure that they get enough, to stop them feeling like, oh my gosh, we're just going to go full on out to build out our entire ecosystem and go talk to somebody else. So does it take away the motivation? I don't think so, but it's definitely better than the previous alternative
Starting point is 04:33:29 that we had last year. I do think you bring up another interesting question, which is the Taiwan DSMC question, which is that we as a country, I guess the world, essentially has a relationship has a reliance on a couple of companies around the world who are essential. One is ASML in the Netherlands who build these lithography machines, and the second is obviously DSMC.
Starting point is 04:34:01 If something happens in Taiwan, any number of situations could happen, but it probably means we don't get iPhones. You know, we don't get AI chips. It's not going to be good. And I think this is where the TSM project in Arizona, all of President Trump's efforts to onshore these capabilities, the recent deal that President Trump and the Commerce Secretary did with Intel on this, all kind of play a part, which is this is such an important capability
Starting point is 04:34:36 that we as America need to have, on our soil, and we need to find ways to bolster our entire chip manufacturing supply chain and not be so reliant on one single point of failure. So when I think about TSM, I'm thinking about all things Intel, I'm thinking about all things in, you know, how do we get these fab construction in Arizona, you know, going faster. How is that factory going? I mean, from my interview with Balshi Kim, it sounds like, you know, the VP of Taiwan, I mean, it sounds like that the tariffs may have had some implications on the speed of that.
Starting point is 04:35:22 Well, I think they want a lot faster than the Biden regime, for sure. But I think there's always, like, space to do more. I would think that, you know, one of the things that we have really tried to emphasize is that we just need to accelerate. you know, our indigenous supply chain. And I don't know the latest numbers on how these fab production is going. I believe they've kind of increased, like,
Starting point is 04:35:49 their capacity in some way than what it was two, three years ago. But we need to do more. I believe even if the fabs are at the current rate, our full potential, it's not going to be sufficient to make up for all the things that DSMC winds up making. So we need other answers. We'll probably, we're going to need Intel. We might need, like, other answers
Starting point is 04:36:09 in there. I think this is going to be one of those interesting questions over the next few years and how do we build out these sovereign chip making capabilities. So that for me is one. But I want to kind of come back to the Chinese Huawei question because I do think it's important. Like one of the things top of mind for me right now is how do I stop? How do I make sure that if somebody out in the world is building an application, they're building a critical piece of infrastructure, they are picking our chips, our model. writing applications on top. It's creating a moat, and they're not picking the other team.
Starting point is 04:36:44 That is very much on top of mind for me right now. I have an idea on how to speed it up. Please, yes. And you'll have to fax check me on this through my other episode with Shelby Kim. But one of the hurdles I think that frustrates them is they've bought, they're very concerned, obviously, about China taking them either by cognitive warfare or actual kinetic warfare. And they purchased several planes, jets, military equipment from us.
Starting point is 04:37:17 You know, but through our red tape and our bureaucracy, I mean, I can't remember, I believe it was about, I think it was, somebody's just going to have a fact check me. I can't, I can't look it up right now. But they bought these weapons like five years ago, you know, and they just got their first F-16, you know, and that's created. a lot of frustration over there. And so if there was a way to get through the rev tape and get the stuff that they had purchased from us
Starting point is 04:37:49 in a faster way, I think that would help motivate them to help us build the chip factories in Arizona. Okay. Not super familiar with that. I need to go back and do some homework on this. Yeah. I mean, I know that they're frustrating. They're frustrated about that, and rightly so.
Starting point is 04:38:09 I mean, shouldn't have to buy something five years in advance before you get it. But, I mean, so how far along are we in that facility? I wish I knew this is off the top of my head. I believe, man, I think they had faced a few challenges in the first few years around permitting and workforce. I believe they've just entered production recently. But we just need to get a lot more going there. So it's up and running? I think so.
Starting point is 04:38:45 Okay. Okay. Well, that's good to hear. Yeah. That's great to hear. But I don't want to get fact-checked on this. The way I understand it is that, and I'm going to be spending a lot more time on this because I've been spending time on some of these model questions, is that it was in this really
Starting point is 04:39:02 slow phase for a little of time. because they had a lot of issues with finding the right skill set. I think at some time they had trouble finding enough electricians and other skill technical people. But I think it's ramped up now, but I don't have the latest and greatest and exactly where they are right now. Could you paint a picture of what it looks like
Starting point is 04:39:26 not only for the U.S., but for the world, if China does win the AI race? What are we looking at? Oh, man. Oh, man. Well, AI, look, there are multiple timelines AI could take. I think it's pretty obvious that AI is going to have profound impacts on the economy, on productivity, just making people's jobs better on drug discovery.
Starting point is 04:39:56 Imagine a world where China just dominates drug discovery. Imagine a world where every Chinese individual is able to be smarter, more productive, get more done than what we can do. Imagine Chinese companies being able to build faster, being able to innovate faster, and that flywheel just picking up momentum. Imagine a world where our allies are using Chinese technology. This is actually this has happened before with 5G. where a lot of the world wound up running on 5G technology
Starting point is 04:40:35 from China and instead of something from a Western country, and that caused a whole set of issues. So imagine a world where AI being so much deeper. You mentioned it knowing people's personal issues, it knows how your business works, it knows your numbers, it is helping you make critical decisions, all of this being run on Chinese models and the influence in the power it would have.
Starting point is 04:41:02 So imagine a world of robots, building inside factories, inside people's homes, which are all powered by Chinese software and hardware all over the world. And again, this is for me a nightmare scenario. We are not going to let this happen. But that was, I think, in some ways, the path we were on. Yeah, I mean, I talked a lot about this with Alex Wang as well, and he had talked
Starting point is 04:41:30 about the number of AIs against a number of AIs. So we would need more, you know, if they have 100 AIs and we have 200 AIs, and they could dedicate their, you know, allocate, you know, I don't know, 33 AIs to surface war per vehicle. And if we have 200, we could put 66 against that 33. I sometimes find those a little too theoretical. Okay.
Starting point is 04:41:56 I often, when we talk about AI, I often want to ground it in terms of how does it make a regular person's life better? And I just think about an individual who can just do more. They're smarter, like the Iron Man suit, they're better informed, they can get more done. Because they're getting more done, they can work more effectively together. They have an always-on smart colleague in an AI who, you know, they can ask things to,
Starting point is 04:42:27 their companies can innovate faster, build more. So sometimes I worry that we get lost in a little bit of the theoretical elements of AIs versus AIs. Maybe that's true. It's hard for me to tell with high levels of confidence how that plays out. But I do have a high degree of confidence that AIs will need human beings and need to augment human beings. And I think about how do we make sure, like, we are augmenting every single American. We're helping every single American every day with AI in going to work.
Starting point is 04:42:59 you know, with their family, with their hopes and dreams, much better than the other team is. And I also don't want that to happen using Chinese technology. Yeah, yeah. You know, I never thought about, you know, when we were talking about open source and with DeepSeek, I mean, if that's the open source model, then the entire world uses that, maybe except us.
Starting point is 04:43:22 And so they're using a Chinese model. And I understand that, I totally understand that. But what I'm getting at is, I mean, what are we doing about open source to spread it? Is it unlocked so that everybody can use our stuff, and where are we with open source? The first thing we did is we basically said, stop scaring people about this; this is a good thing. And this is very important. Like, if you go read the AI Action Plan that I mentioned, which is the official document, it is literally the first paragraph, which is we want American open source to win. And the reason this is important is when the government takes a symbolic tone, it sets
Starting point is 04:44:04 a tone for everybody in terms of do we like this or not. So that's number one. So companies would talk to us, big companies, who tell us, hey, you know, I want to build this open source model, but with the previous regime, you know, or the other state legislators, they were scaring us. They would say, if you guys do this, you'll have all these lawsuits and you'd get sued out of business. What do you guys think?
Starting point is 04:44:28 And our message has been like, we want you to go out there, we want you to win. We want you to win on open tools across the world. And if you look at the last four or five months, again, I think China's done a great job. I think their models are really innovative. I just want to make sure, like, you know, we give them props where they are due.
Starting point is 04:44:46 On the other hand, we now have some of our own open source models. OpenAI, for the very first time, launched their first open source model. It's called GPT-OSS. It came out three weeks ago. There are multiple startups which are now building open source models. Again, because when you have the government saying nobody should do this, it's very hard
Starting point is 04:45:06 for a startup to go out and raise money. It's very hard for an academic to work on this. We have come in and said, no, we need this for America. But look, I'm not going to say it's all hunky-dory. That is a place where I think we need to do a lot more. So the action plan has a bunch of initiatives in there on how we are going to work with academia on making open source more real, and how we are going to work with industry.
Starting point is 04:45:30 So that's one. I want to talk about something else here, which is about laws around AI. It is very related. So I mentioned, when I was talking about open source, this thing called SB 1047. And this was a piece of legislation from the state of California which would have essentially destroyed open source,
Starting point is 04:45:48 all of American open source. The way I'd put it, it's a little bit like gun liability. Do you remember when some people said, you know what, if somebody gets shot with a gun, the manufacturer should be liable? It tried to do the same with open source models, where it said, if somebody does something bad with the model, Mark Zuckerberg or Elon Musk or whoever should be personally, you know, their company should be held liable, which in essence would have just destroyed
Starting point is 04:46:11 open source because no company can really afford to take that risk. And the challenge is we now have every single state wanting to do their own rules around AI, you know, sometimes around open source. So the president, three weeks ago, came out and I think said two important things. The first thing he said was that AI is a national security issue. So it is way too important to just leave everything to the states. I don't know about you. I don't want California setting the tone for how we use AI all over the country.
Starting point is 04:46:47 And there's a Gavin Newsom joke in there, but I don't want to go there. He did veto the last one, but I don't want to go there. But you don't want any individual state to set laws for the country. And how can that happen? Because, well, if you have a really important state make a law, the companies would be like, well, we have to abide by those guys, you know, we have to abide by those guys. Let's just pick the lowest common denominator. On the other hand, what is China doing? No red tape.
Starting point is 04:47:17 Go, go, go, go, go. So President Trump came out and said, listen, you should listen to the speech. He's like, this is a national security issue. And we need to make sure that we are doing this at a federal level. So I think this is connected. Very related is this idea of copyright, where, you know, remember what I said about models. Models become better when you have more data and you have more infrastructure. And the Chinese models are training on our data.
Starting point is 04:47:51 But if our American models are stopped from training on that same data, the other models are going to get further ahead. So I know it's a complicated discussion, but in my mind, I think of the national security dynamic of how do we make sure our guys are allowed to do the exact same things as the competition. Because if they are training on our data, our guys should also be allowed to train on our data.
Starting point is 04:48:14 Otherwise, we're going to fall behind. So with open source, I think it's not just encouraging open source. There is a cluster of issues which are all connected, and I think, for folks watching this, you should read the action plan and you should listen to President Trump's speech when he announced the action plan. You know, I know you had a very important dinner last night. In fact, three former guests were there: Jared Isaacman, Alex Wang, Shyam Sankar. You were there.
Starting point is 04:48:41 Well, I would say, so we had this big event yesterday, which is we had this whole AI education task force meeting with the first lady, which I think is maybe something we should talk about, where, you know, the first lady and the administration really care about how Americans get the skills to work with AI. So there was a big event, and as a part of that, there was a dinner with several really, really interesting people.
Starting point is 04:49:11 And I think, you know, the overwhelming tone, if you look at that dinner, which, by the way, I was not at, I was on a flight to get here. Oh, shit. By the way, folks, you know, I had to cancel on you yesterday because I was like, hey, Sean's team, I'm getting pulled into something.
Starting point is 04:49:26 I'm going to have to move it to the next day. And then I was like, see, Sean, like, this was why I had to move it to the next day. You know, I'm not just making up an excuse. I'm not being a jerk. But if you look at the dinner, the overwhelming message from everybody was like, we just love what the administration has done in cutting red tape, in being pro-innovation under the leadership of President Trump. So you kind of see that overwhelming message,
Starting point is 04:49:55 but it is fun and entertaining in a way only President Trump can be. I saw that, I could be off on this, but did Tim Cook say that he was going to put $600 billion into U.S. manufacturing? I think that was Zuckerberg. Was that Zuckerberg? I think there was definitely a big number from Apple, but I'm not sure whether it was yesterday. I think that was Mark Zuckerberg.
Starting point is 04:50:18 I could be wrong, right? Like, I was on a very late plane here, so I wasn't watching the videos. But I think it was Zuck who did that. But I think you're seeing this across the board. Look, with President Trump and the administration, there's one very key message, which is you need to invest and build in America. Like, you see this message through and through and through. So, you know, there are so many examples. You had TSMC come in and talk about their investments here.
Starting point is 04:50:47 You had Tim Cook go to the Oval and talk about investing here, Zuckerberg yesterday, multiple, multiple companies basically talking about how they are investing tens of billions of dollars in infrastructure all over America. And I think this is a key theme, which ties to some of the pieces we talked about. Like, the reason to do this is not just chip sovereignty, but it's the idea of, like, we want to bring back investment in jobs and infrastructure in America. But, like, there are so many examples, from the tech world and not just the tech world, of people announcing investments in the U.S.
Starting point is 04:51:25 Yeah, sounds... I think there was a second comment. I think the president looked at Mark and he said, Mark, this is the start of your career in politics. And Mark was like, I don't know about that. It was interesting to see Elon didn't have a seat at that table. I think he said he was... I don't know. I think he said he was invited. I think he sent somebody else. Don't know. I wonder why. Well, Sriram, this was a fascinating interview, and I think, I mean, I'm excited for you and excited for the country, and I love your plan, everything that you just talked about here. And, man, I've got to hand it to you, you're probably, no, not probably, you are in one of the, if not the, most important roles right now.
Starting point is 04:52:19 I think that this is a lot more serious of a race with China than most people even realize. And so I just commend you for rising up and taking the opportunity. And we're all rooting for you. Thank you. Thank you for having me, it's such an honor. And look, this is such a privilege. Like, it's such a dream to just even have this opportunity to be a part of the administration, to work for everyone watching this video.
Starting point is 04:52:46 And for me, I just think every day about, like, the time I have here, how do I make sure, one, we're winning against China, and two, how do I make sure that AI is working for every single American? Like, I keep going back to, we need to make AI work for every individual. Like, whether it's to help you, like, I think about my dad, how will he use AI, you know, how would it help him spend more time with his family, how would it help him with his job, and, you know, all the other parts of his life. And I think about both those questions a lot.
Starting point is 04:53:17 And I also want to say, I just want to hear from everybody. Part of the reason I wanted to do this is because I think it's super important to just be very, very transparent. So if folks have thoughts and views, hit me up and let me know. We are open for business, we want to hear from everybody. But most of all, thank you. I've been a fan for a long time. I've been such an admirer of what you've done. I can't tell people enough how much being in this room, seeing the stories on these walls, is a testament to the space you have created. You deserve all the success, and you're just getting started. And thank you, this means a lot to me, to be here. Yeah, thank you. That means a lot. Cheers. Thank you.
