TBPN Live - Elon's Trillion Dollar Pay Package, Breaking Down the State of AI | Katherine Boyle, Mikey Shulman, Immad Akhund, Jordan Castro

Episode Date: November 7, 2025

(00:17) - Timeline in Turmoil over OpenAI
(09:36) - Fannie & Freddie May Buy Tech Stakes
(11:27) - The Mansion Section
(18:23) - Elon's Trillion Dollar Pay Package
(28:06) - Breaking Down the State of AI
(55:55) - 𝕏 Timeline Reactions
(01:11:07) - Katherine Boyle, a general partner at Andreessen Horowitz leading its American Dynamism fund, discusses the transformative speech by Secretary of Defense Pete Hegseth, which outlines significant reforms in defense acquisitions. These reforms prioritize commercial-first technologies, streamline the procurement process, and encourage competition from startups, aiming to modernize the military and enhance national security. Boyle emphasizes the positive impact these changes will have on Silicon Valley and the broader defense innovation base, marking a pivotal shift in defense technology integration.
(01:29:44) - Mikey Shulman, co-founder and CEO of Suno, an AI-driven music creation platform, discusses the company's mission to democratize music production by enabling users to generate songs through text prompts, eliminating the need for traditional instruments or production software. He highlights Suno's evolution from a Discord bot to a web application, emphasizing the shift from a tech-savvy early user base to a broader audience seeking creative entertainment. Shulman also addresses the integration of AI in music, noting that while professionals increasingly adopt Suno to enhance their workflows, the platform primarily serves individuals who love music but lack formal training, fostering a new behavior of accessible music creation.
(01:56:49) - Immad Akhund, co-founder and CEO of Mercury, a fintech company offering banking services tailored for startups, discusses Mercury's recent milestone of achieving three years of profitability, emphasizing the importance of building trust with customers by ensuring financial stability.
He highlights the company's proactive measures, such as providing accelerated FDIC insurance to safeguard client deposits, especially in light of the Silicon Valley Bank crisis. Additionally, Akhund shares insights into Mercury's strategic use of AI to enhance back-office operations and customer service, underscoring the company's commitment to innovation and efficiency.
(02:14:20) - 𝕏 Timeline Reactions
(02:36:51) - Jordan Castro is an American novelist and poet known for his works "The Novelist" and "Muscle Man." In the conversation, Castro discusses his novel "Muscle Man," which follows Harold, a discontented literature professor who finds solace in weightlifting, exploring themes of masculinity and academia. He also reflects on his personal experiences with fitness, the impact of physical activity on mental health, and critiques the media's portrayal of masculinity.
TBPN.com is made possible by:
Ramp - https://ramp.com
Figma - https://figma.com
Vanta - https://vanta.com
Linear - https://linear.app
Eight Sleep - https://eightsleep.com/tbpn
Wander - https://wander.com/tbpn
Public - https://public.com
AdQuick - https://adquick.com
Bezel - https://getbezel.com
Numeral - https://www.numeralhq.com
Polymarket - https://polymarket.com
Attio - https://attio.com/tbpn
Fin - https://fin.ai/tbpn
Graphite - https://graphite.dev
Restream - https://restream.io
Profound - https://tryprofound.com
Julius AI - https://julius.ai
turbopuffer - https://turbopuffer.com
fal - https://fal.ai
Privy - https://www.privy.io
Cognition - https://cognition.ai
Gemini - https://gemini.google.com
Follow TBPN:
https://TBPN.com
https://x.com/tbpn
https://open.spotify.com/show/2L6WMqY3GUPCGBD0dX6p00?si=674252d53acf4231
https://podcasts.apple.com/us/podcast/technology-brothers/id1772360235
https://www.youtube.com/@TBPNLive

Transcript
Starting point is 00:00:00 You're watching TBPN. Today is Friday, November 7th, 2025. We are live from the TBPN Ultradome, the Temple of Technology, the Fortress of Finance, the Capital of Capital. Time is money. Save both. Easy-to-use corporate cards, bill payments, accounting, and a whole lot more all in one place. Ramp.com, baby. The timeline is in turmoil over Sam Altman again. What were we calling it, backstop gate? Backstop gate continues unabated. John, the timeline's in turmoil over Sam Altman. Every single day this week and last week, it is nonstop OpenAI: what will happen?
Starting point is 00:00:40 Is it over? Is it the end of the global economy or will we, you know, live to fight another day? The main point of debate is over, you know, Sarah Friar's comments that she used the word backstop. She backtracked on her backstop comment. and said, I wasn't asking for a backstop of open AI equity. I was advocating for this American manufacturing plan. Then Simfer Satoshi is going pretty hard.
Starting point is 00:01:12 He says, here's an Open AI document submitted one week ago where they advocate for including data center spend within the American manufacturing umbrella. They specifically advocate for federal loan guarantees, and Simfer Satoshi says, Sam lied to everyone. Let's read the specifics. Yes. The specifics are AI server production and AI data centers. Broadening coverage of the AMIC, which is the American Advanced Manufacturing Investment Credit, will lower the effective cost of capital, de-risk early investment, and unlock private capital to help alleviate bottlenecks and accelerate the AI build in the United States. Counter the PRC by de-risking U.S. manufacturing expansion. To provide manufacturers with the certainty in capital, they need to scale production quickly.
Starting point is 00:01:57 the federal government should also deploy grants, cost-sharing agreements, loans, or loan guarantees to expand industrial base capacity and resilience. Okay, so loan guarantees. So what's happening here is Open AI a week ago, and everyone can go and read this letter, which is publicly available on their website, was making the recommendation that the government should effectively treat data centers or AI or token factories, put them in the manufacturing bucket, which would qualify them for similar incentives that traditional manufacturing,
Starting point is 00:02:42 defense tech, et cetera. And I don't have a problem with asking the government for a handout. I think that that's actually like best practice. It's actually in your shareholder's responsibility. Like you have a fiduciary duty to ask the government for as much help as possible. I think that everyone should go to the government. right now. Hey, if I'm paying 50% tax, how about we
Starting point is 00:03:01 take that down to 20%? How about we take it down to 0%? Like, you have every incentive to ask your person in Congress, your senator, the people in Washington to do everything they can to support your mission. This has worked out in the past with Elon and Tesla. It
Starting point is 00:03:17 didn't work out in the case of Cylindra. But like, the game on the field is if there are taxpayer dollars that are moving around the board, you want to get those into the industries that are aligned with you. And so the thing that people are taking issue with is that in the opening of his message yesterday, he said, we do not want, we do not have or want government guarantees for open
Starting point is 00:03:38 AI data centers. Yes. And that seems to conflict with the message, the letter that they wrote a week ago that is still up on their website. Yes. So if it's, if it's, what is it, what is it? Loan guarantees to expand in Israel-based capacity. The X-Chat is taking the initiative.
Starting point is 00:04:00 Just sent the admin a safe to sign. On-cap note. You literally can do this now. There's a sovereign wealth fund. Like, it sounds crazy. But, like, they're ripping checks. Like, this is the new economy. And, you know, you can be upset about it.
Starting point is 00:04:18 But you also have to understand, like, what is the game on the field? You can always advocate for, like, we should change the game. Like, we shouldn't be doing this. Like, I would prefer a more of a free market economy. But in the world where we're not in a free market economy, you want to have your company win, right? That's just rational. That's just actually playing the game on the field. Now, it is weird optics to talk about the game on the field.
Starting point is 00:04:43 That's something most people don't like doing. And that's very odd. Because when you say, oh, yeah, this is a one hand washes the other situation. Or, oh, yeah, this is a situation where, um, this is a situation where, um, you know, a backstop will allow us to be more aggressive. That feels like the banker saying, oh, yeah, I knew that the government was going to bail us out in 08. So I was intentionally underwriting loans that where it was somebody's fifth house and I knew that they couldn't pay it. I wasn't asking about their job.
Starting point is 00:05:11 I wasn't asking about their income. I wasn't asking about their assets. And so they, I pushed it way further and I made a lot of money and I got out at the top. That's what's really upsetting to Americans because the bailout comes in, the backstop comes in. Yes, it makes sense to rationally do the backstop in that moment.
Starting point is 00:05:29 But if some people get out early and then other people get cooked, that's really bad optics. That's really, really bad optics. And which is why I kind of expect a lot more of the narrative and to shift towards subsidizing and incentivizing,
Starting point is 00:05:47 like bringing new energy on time. You were talking about that yesterday. Which directly benefits the labs, and anybody building a data center. But it also feels very much in America's interests broadly, right? And it benefits, it theoretically would benefit the average American, too. We were listening to Ben Thompson this morning. It wasn't on Restream, one live stream 30-plus destinations,
Starting point is 00:06:08 multi-stream, reach your audience, wherever they are. We were listening to the Stratory RSS feed. Hopefully, Ben, goes live at some point in re-stream. That'd be awesome. By the way, I think the chat has discovered that, yes, this is a turbopower. this is a one of two i sent the other one to simon oh that's amazing but you can see it looks great especially with the green background the color gray the production team is on point today
Starting point is 00:06:33 let's let's hear for the production team um so uh so yes i i agree with you on the on the energy front um i think ben thompson his point today and fabs yeah and fabs too yeah his point today was like, it was like, open AI needs to be crystal clear about the position that they're in, which is that they are the hottest company in the world. There is unlimited demand for their shares. They could be a public company. They could go raise more private capital. They need to be on the opposite end of the risk curve from the stuff that's like, uh, no one really wants to invest in an American fab that might lose money for a decade. Like, like, there is truly much less appetite for, yeah, let's go and build a nuclear power plant that might take a decade. And who
Starting point is 00:07:22 knows? Like, when you think about the things that make money in the short term, it's SaaS, right? AI for SaaS. Just go and transform the legacy business with some AI sprinkled on top and just start printing money. Like, this is what works. You know, people's concern for government-backed data center lending is that you're lending against chips, which have a really fast depreciation schedule. Yep. We don't know. There was some pushback.
Starting point is 00:07:52 I made the comment, hey, maybe that these things don't depreciate as fast. There was some pushback in the comments. I think everyone kind of agrees that these things depreciate quickly. And so energy infrastructure is the place. Yeah, and if you look at right now, it's core weaves corporate default swaps are now sitting around 500 basis points, jumped up dramatically. and so this is a this is one of the leading neocloud 500 basis points 5% yeah they jumped 5%
Starting point is 00:08:23 uh no they jumped they jumped from like i think like two to five oh okay try to find somewhere in the stack yeah yeah yeah but sorry i don't mean to like ask particular uh stats but uh clearly like people are are worried about this yeah so so and again this is for a leading neocloud right we we uh semi-analysis uh cluster max uh the updated version came out yesterday They're only in the platinum. Neocloud in the platinum tier. And people are worried about them, right? Yep.
Starting point is 00:08:53 Potentially, you know, having some bankruptcy risk. And so if you start doing, you know, basically government guaranteed data center lending, you could get in a situation where there's a bunch of new data centers that come online that really don't have a clear pathway to ROI. Yeah. And it just incentivizes the entire stack to just get, you know, really exuberant. right? And again, going back to Sarah Fryer's interview on Wednesday, she was, she felt the market was not exuberant enough. And I think a lot of people disagree with that, right? There's a lot of, there's been a lot of insanity this year, silliness. Maybe we don't need more. But we will see. The other news that was interesting out of today, the director of the Federal Housing Finance Agency, William Polaro. Fannie and Freddie eyeing stakes in tech firms. Bill Pult, the director of the federal housing
Starting point is 00:09:53 finance agency, said that Fannie Mae and Freddie Mac are looking at ways to take equity stakes in technology companies. We have some of the biggest technology in public companies offering equity to Fannie and Freddie in exchange for Fannie and Freddie partnering with them in our business. Pult said Friday during an interview at a housing conference. We're looking at taking stakes in companies that are willing to give it to us because of how much power Fannie and Freddie have over the whole ecosystem. So, yeah. This, this Wall Street Journal event is just so many articles came out of this.
Starting point is 00:10:26 The Wall Street Journal did a fantastic job, bringing a ton of people together. This is where that Sarah Fryer quote came from. It's also where the Corrieve CEO was on stage. You were mentioning Corweave earlier. And the Corweave CEO, the headline in the Wall Street Journal is, CoreWeef CEO plays down concerns about AI spending bubble. And the quote is from Michael, if you're building something that accelerates the economy and has fundamental value to the world, the world will find ways to finance an enormous amount of
Starting point is 00:10:55 business. And he went on and said, if the economy doubles in size, it's not a lot of money to build all those data centers. And so there's a lot of folks to, you know, addressing bubble concerns right now. He says, it's very hard for me to worry about a bubble as one of the narratives when you have buyers of infrastructure that are changing the economics of their company, they are building the future. And so if you are, if the products are, you know, effective in growing the economy, then all of the investment is worth it. There's a fascinating
Starting point is 00:11:28 mansion story we should get to that's actually related to this. Good. So there is a, before we do, let me tell you about Privy, wallet infrastructure for every bank. Privy makes it easy to build on crypto rails, securely spent up white label wallet, sign transactions, integrate on chain infrastructure all through one simple aviation did you get a whole new soundboard what's going on with the soundboard i did thank you really like they swapped them all out michael do you still have the horse i still got all the classic got all the classic but i'm working with some new material working with some new material i like the sheesh uh can we get the vine boom on there is that on there uh i don't know boom the thud that's a big one uh anyway casa and cantata
Starting point is 00:12:04 is a 1930s estate in los angeles it's one of the most important homes in the 20th of the century In 2023, it was also briefly the most expensive home for sale in the United States with an ambitious asking price of $250 million. What? This is in Bel Air. So it's a Bel Air property. It was sold in the foreclosure auction after the death of its longtime owner, financier Gary Winick. So it's an 8.5 acre property. He bought it in 2000 for $94 million.
Starting point is 00:12:38 He bought it in the year 2000. No way. Do you know who this guy is? Gary Winnick. He had something to do with the last bub. He did have something to do with the last bub. It's one of the craziest stories. So,
Starting point is 00:12:54 To satisfy the debt, which is now grown, blah, blah, blah, blah. So basically, like, it's a 40,000 square foot house built in the 1930s, counts Hotelier Conrad Hilton and Dole Food billionaire David Murdoch among its former owners. It's located right next to the Bel Air Country Club golf course. It has its seven bedrooms, has a swimming pool, tennis court. It's awesome. Anyway, Gary is perhaps best known as the founder of Global Crossing,
Starting point is 00:13:25 which built fiber optic cable, a fiber optic cable network across the world. The company made him a billionaire, but it imploded. Did they have a little bit of dark fiber? A little bit of dark fiber. imploded in the early 2000s under the weight of massive debt. Casa Incantata, Gary's primary home went on the market in June of 2023, five months before his death. Now it's asking $190 million,
Starting point is 00:13:53 and they kind of move on from there. But is there ever a lesson? He was somewhat of a global businessman. He was the headquarters were in Bermuda. You know this? I didn't know that. I wonder why. I wonder why.
Starting point is 00:14:07 I mean, it might literally because it insulates the assets in America. Like, that is one thing that you can do if you don't want to have to give up your lovely home post-bankruptcy. You want to be able to get liquidity before the bubble pops. That's the lesson here. Sell the shares before the top. Buy low, sell high. That's the phrase that we live by here. Was there anything else, Tyler, that came out of your deep room?
Starting point is 00:14:37 research report on Gary Winnick? Did you chance? I mean, so let's see. How do you tell this? He also started Pacific Capital Group in 1985. That was kind of the precursor. So he was set up to to actually marshal all the debt to do the big infrastructure building. He had been a global businessman for a while before. Okay. Nice. Let's give it up for global businessmen. Let's give it up. Yeah, what a wild, wild story. We're hyper, hyper local. businessmen. We like to do our business right here in the Ultradome, but we do. Got to give it up for them. Also in the mansion section, which we should continue on. But first, let me tell you about cognition. The makers of Devin, the AI software engineer, crush your baglog with your personal
Starting point is 00:15:23 AI engineering team. You have a new neighbor, Jordi. You have a new neighbor. So Tom Petty apparently lived in your neighborhood in Malibu, California. The late free-fallen rocker had a personal music studio. Some deep lore. Please. I saw Tom Petty. I believe it was the first ever. Who headlined the first ever outside lands?
Starting point is 00:15:49 Was the first or second? He was living in Malibu in a $11.2 million house, 8,744 square feet, seven bedrooms as a music studio. The buyer is Stephen Slade Tien, a psychoanalyst and author, according to people with knowledge of the deal. TN didn't respond for a request for comment. I wonder if he'd want to get beer sometime, hang out, maybe go surfing, go for a walk in the beach. We should reach out to him for comment on. Okay, so I was at the first ever outside lands. Really?
Starting point is 00:16:29 How old are you? I thought you were born. It was basically the first ever. like major concert experience. I didn't realize. And so Tom Petty headlined on Saturday. This was in August of 2008.
Starting point is 00:16:44 And that was my first time smelling cannabis. And I kept, I was there with my, my friend and his parents, and I kept asking, like, his friend's parents, like, what is that stinky smell? It's so stinky. We can't get away from it.
Starting point is 00:17:02 Yeah. Yeah, that's hilarious. And very, very memorable. Very memorable. That's amazing. Do you know where this is? 2.6 acres above Escondido Beach. Is that close to you?
Starting point is 00:17:13 Yeah. It's a few minutes away. It's a gated property. Escondido Beach is the most underrated beach. It was Petty's home for decades before his death in 2017. Lead singer of the Heartbreakers purchased the property for about 3.75 million in 1998. Petty turned his guest house into his personal. music studio with soundproof rooms for recording music and said Levi Freeman who's putting up
Starting point is 00:17:38 there's a one-bedroom guest suite seems like a very nice story he was from Florida he shunned the spotlight off stage he's a member of the rock and roll hall of fame best known for songs like American girl and I won't back down what a what a lovely little house well that's always fun anyway we should move on to our top story should we should we do you want to go through the Elon pay package thing a little bit? I had two questions for you on that, and then we can go into our MAG7 review. Is that sound good? Let's do it. Okay. First, let me tell you about Figma. Think bigger, build faster. Figma helps design and development teams build great products together. I really enjoy this graphic package we got. This is great. So, Elon's trillion dollar pay package
Starting point is 00:18:28 is done. It's signed. It's approved. I'm sure it will be tested in the courts. It's always contested in the courts. But the Wall Street Journal is a very nice little breakdown of how it works. They have a nice little infographic here I can share. And it kind of shows this is this is this is what technology podcasting is all that up. It's a holding hold that up. Pull that newspaper. Yeah, yeah. So basically Elon could get one trillion in Tesla stock if he hits all these different tranches. And so it's actually not that many shares. So he's worth half a trillion now, but he also owns 414 million Tesla shares outright, got another
Starting point is 00:19:14 award in 2018 of 300 million shares, and this next award is 424 million across 12 tranches. So it's not like they're giving him twice as much as he already has. They're just kind of giving him a little, like basically what he already had. they're giving them the same amount again. And there's a bunch of things that he has to do. He has to get the market cap really, really high. And then there's also these like qualitative operational goals, or I guess they're quantitative, but 50 billion in EBITDA,
Starting point is 00:19:44 20 million cars delivered, one million robots sold, one million robotaxis in operation, 10 million full self-driving subscription. Now, some of those are obviously more gamable than others. What's the definition of a robot? If he comes out with a really cheap robot and he sells a bunch of those because it's more of a toy, does that really fulfill the goal?
Starting point is 00:20:01 Or like, what's the, what's the, you know, million robotaxes in operation? What's the definition of a robot tax? Yeah, what qualifies. Is it just a Tesla that is enabled for... Yeah, and I turn on FSD and my friend rides in it for one day. Is there anything for actual rides? There's 10 million full self-driving subscriptions.
Starting point is 00:20:19 Yeah. And so some of these are more gameable than others, but the market map really isn't. How many full self-driving subscriptions are there today? I saw, I looked that up. It's somewhere between like one and three. three million right now. So he has to, he definitely has to like triple the size at least. Robotaxies obviously goes from like zero to one million because there's barely any on the
Starting point is 00:20:38 road. He hasn't sold any robots. So a million would be entirely new robots. He's obviously delivering a lot of cars. And on the EBITDA front, 50 billion in EBITDA, company did like 13 last year. So that's, that's a huge increase in EBITDA. I mean, 50 billion and EBIT does a lot of money. But he's, you know, it's not, it's not 20 X where he is right now. And neither is the market cap. Like he's he only has to take the market cap to 8.5 trillion and Tesla's already worth a trillion. So it's it's it's within you know striking distance. The market the market cap is now around 1.5 trillion actually. So my two questions were one like it's going to be weird to live in the world of the trillionaire like what we are getting close like that's going to happen not just within
Starting point is 00:21:22 our lifetime like definitely within the next decade. This sets him up to be the first one but it's going happen. And I wonder how that's going to reshape our culture, like the world in America, because when I had this realization that when billionaires became so prevalent and prominent, there was a lot of heat that was taken off the millionaire. Like, if you're just like a guy, yeah, I have an HVBillionaires. Billionaires have a heat shield. Yeah, exactly, exactly. Like, yeah, I'm a millionaire. I have a boat. I go to Bass Pro Shops. But I'm not getting protested. Because I have a million dollars in my house and boat and practice. One out of ten Americans is a...
Starting point is 00:22:03 Yeah. Yeah. So the millionaire became more accessible. And the billionaire became the thing that the society scapegoats for all the problems. Yeah. Approximately 9.4 to 9.5% of American adults are millionaires. Yeah. But my question was, what do you...
Starting point is 00:22:24 Like, what happens to the billionaire when trillionaires come in? Because, you know, like Bernie Sanders and there's a whole crew that say, like, billionaires shouldn't exist. Every billionaire is a policy failure. Well, like, what happens when you have to say, like, well, like, trillionaires are the real policy failure, but billionaires are also the policies failure. And millionaires were like kind of okay with, but it's not great. It's like, it becomes much more complicated. But at the same time, it definitely, like, if Elon is the only trillionaire, it's going to be really, really easy to target him and be like, he's bad. He's a trillionaire.
Starting point is 00:22:52 He's a bit more targeted? I don't know. Yeah. Maybe he maxed it out already. But I thought that was interesting. And then the flip side was, what does this mean for other companies? What does it mean for Sam Altman at OpenAI? Can he run a similar playbook?
Starting point is 00:23:07 It's clear that he had a ton of soft power during the Open AI coup. Could he go to the OpenAI for-profit board and say, hey, if Open AI IPO is at $2 trillion, I want 20%. If Open AI moons to $10 trillion, I want 50%. Like, how extreme can Sam get? We know that Sam runs a bit of an Elon playbook. They were in business together. They co-founded Open AI together. So clearly they learn from each other.
Starting point is 00:23:34 I wonder what Sam Altman can do similarly. And then I also wonder what will happen at the Garden Variety Unicorn. If you're just the CEO of a $5 billion company and you're just kind of hanging out there and you say, like, yeah, I had 30% of the company when I started. diluted down to five or ten, but I like this company, and I want to get it up. Like, what if I say, hey, could I go to the board and say, okay, we're at $5 billion now. If I get us to $50, will you double my equity position? And how would shareholders treat that?
Starting point is 00:24:10 How would Sequoia treat that, our founders fund, or Kleiner, or A16C? Like, how would the growth stage venture companies, the venture capital firms, deal about that? So, I don't know, any reactions to that stuff? I think there's a sentiment, like there's, you know, any venture-backed founders like going to be hyper-conscious of dilution, right? There's a sense that it's like one way. Yes. Right?
Starting point is 00:24:35 It just goes down and down and down and down. And I think the right way for founders to think about that is like, okay, you're not actually, like, no one's taking your shares unless you decide to sell them. Yes. Your job is just to make the share price go up. And there's going to be more shares issued over time. Yeah. But if you just make the share price go up forever, it doesn't really matter.
Starting point is 00:24:54 And you can also buy back. You can also, you know, get. Buy back like Drew Houston. Yeah, Drew Housden is the best example. And then give more share, create new shares if you want to get your percentage ownership way down. Yeah. Like if you want to go to zero. More like, if you want to bail.
Starting point is 00:25:11 I mean, yeah, we need to do, we need to do an analysis of the most deluded CEOs in the public markets. because if you look at some of the IPOs from the last 10 years, like some of the guys that are still hanging around running these companies have sold down so much of it. And this is what makes Larry so admirable. Yeah, because just buy back, buy back, buy back. And he's the second richest person, $300 billion. It's remarkable.
Starting point is 00:25:39 By the way, I think Oracle is like fully round-tripped now from. It happens. Well, what about the, so, yeah, I mean, the question is, Like, on what time scale do you think this happens? Like, Jordy, do you think we'd actually hear the story of somebody, a CEO founder, maybe past their vesting cliff, maybe one of their co-founders has left? Because I think about that a lot, where it's like, okay, yeah, there were like two or three people.
Starting point is 00:26:07 They were basically equal. They did their full, like, four-year earn-out. But then there's clearly one that's, like, still there grinding for the next decade. Like, they kind of do deserve more. It's not that crazy. And yes, there is just the stock buyback don't sell, but is there a world where someone like Drew Houston goes to the board and just says like, I think I can five-ax this and I want the pay package to do it. And I'm going to be in the office nonstop. And I'm going to go more time. I think that happens all the time. Not to this degree. No way. Not okay. Not to that degree. Yeah. No, no. Not even the headline number. Just in the sense of like we're going to dilute everyone like five or ten percent. If this happens, like a trillion on 8.5 is serious dilution for the rest of the shareholders.
Starting point is 00:26:54 But if I'm holding at 1.5 trillion and I'm like, you're going to take me that 8.5 trillion? Like, I'm totally in for 10% dilution. You're going to 5x my shares. Like, I'm in. But what's interesting is that we just haven't seen other CEOs pull that from the Elon playbook and say, I'm doing it too. Was it, was it Kimball or someone else saying like almost no other CEOs would take a deal like this? because it's so ambitious. Yep. And so I think it's healthy. Yeah, no, no.
Starting point is 00:27:21 I think people are going to, people are offended by the headline number. Totally. Yeah, yeah, yeah. Yeah. So, I mean, obviously, I'm very pro this. I think we're all pro this. That's not the, that's not the discussion. The question is like, how widespread does this become?
Starting point is 00:27:35 Does it become a medic? Is it like every C.E? Because there's a lot of people that are just like, oh, Elon did it this way. I want to do that way. You know? And I'm wondering how much it actually spreads. Anyway, we'll have to keep monitoring it. We'll also have to tell you about Vanta.
Starting point is 00:27:48 Automate compliance, manage risk, and accelerate trust with artificial intelligence. Vanta helps you get compliant fast. And we don't stop there. Our AI and automation powers everything from evidence collection to continuous monitoring to security reviews and vendor risk. What else is in the timeline? There's so much in the timeline. Should we do our mega cycle review? Let's do it.
Starting point is 00:28:11 I got the laser pointer. We got the laser pointer. Our good friend Tyler Cosgrove has put together a slide deck for us that tries to help map the Mag 7. Really, I call it the TBPN Top 10. The top 10. Well, technically nine. There's nine. Well, no, no, there are 10. There's 10. There's 10. We'll see. Well, one of them maybe shouldn't be there. No, no, no, there's actually 10, because it's the Mag 7... oh, Oracle. I forgot Oracle. Yes, you did. So it is the TBPN Top 10, the 10 most important companies in AI, loosely the Mag 7 plus a few bonus ones.
Starting point is 00:28:51 And we're going to try and take you through. And look, you've got to hit the horse, you've got a hoof. And so we're going to try and go through the various companies and rank them based on how AGI-pilled they are and how much they need AGI. Is that right? Yeah, basically. So let's pull up the actual scales. There's two axes. Let's go to the next slide.
Starting point is 00:29:12 So basically on the horizontal, we have how AGI-pilled they are. So I feel like that's fairly self-explanatory. You kind of believe that AI will become something like, it can produce the median economic output of a person. Yes, yes. And there's a few different ways to understand if somebody believes in AGI. It could be the rhetoric of the CEO or the founder.
Starting point is 00:29:33 Sure. It could be the actions of the company. Yeah, yeah. Actions speak louder than words. And is there anything else that could lead someone to show that they believe in AGI? I guess it's mostly just the actions and the words, right? I mean, those are the main things that you can do as a person: say things or do things.
Starting point is 00:29:54 Well, we have it then. We will be judging them by both their actions and their words. So then on the vertical axis, we have how much they need AGI. So I think this is maybe a little harder, so I want to qualify this. Yeah. So, I mean, this doesn't necessarily mean that you have this kind of sentient, you know, AI that's as good as a person. Yeah.
Starting point is 00:30:13 But I think it more so in this context just means that AI will continue to become more and more economically valuable. Yes. To where you can kind of sustain, you know, building more and more data centers. You can do more and more CapEx. Yeah, there's a little bit of, like, how much will this company be transformed if the AI wave plays out well? And if AI doesn't play out well, how chopped, how cooked are they? Exactly. That's a great framework. Yes, and then also, yeah, if we flash forward and nothing really changes, total plateau, total decline in token generation or something, is the business just continuing business as usual? Okay, so who are we starting with?
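The two-axis framework they describe could be sketched as a tiny data structure. The coordinates below are illustrative placeholders, not the actual positions on Tyler's slide:

```python
# Minimal sketch of the two-axis framework described above.
# Coordinates are made-up placeholders, not the real slide positions.

# (agi_pilled, needs_agi), each on a -1.0 .. +1.0 scale
leaders = {
    "Sam Altman":    (0.5,  0.8),
    "Dario Amodei":  (0.9,  0.9),
    "Larry Ellison": (-0.5, 0.7),
    "Satya Nadella": (0.2, -0.2),
    "Tim Cook":      (-0.8, -0.8),
}

def quadrant(agi_pilled, needs_agi):
    """Label the quadrant a leader lands in on the chart."""
    belief = "believes in AGI" if agi_pilled >= 0 else "doesn't believe in AGI"
    need = "needs AGI" if needs_agi >= 0 else "fine without AGI"
    return f"{belief}, {need}"

for name, (x, y) in leaders.items():
    print(f"{name}: {quadrant(x, y)}")
```

The Larry Ellison quadrant ("doesn't believe in AGI, needs AGI") is the one they later call the weirdest place to be.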
Starting point is 00:30:52 So let's start with Sam Altman. Okay, Sam Altman. Where is he on this? So Sam Altman, I think this is a pretty reasonable spot. Sure. He believes in AGI, right? He runs kind of the biggest AI company. He also needs AGI.
Starting point is 00:31:05 Because if you imagine that, you know, if the models stagnate, right, they have a lot of CapEx they need to fulfill. If models stagnate, like, what are they going to do? Maybe the margins somehow work out, but you're probably not in a good spot if models get worse, or if people start using AI less, if you're Sam Altman.
Starting point is 00:31:22 But he's also not in the top, top corner. Okay. Right? And I think this is, you can justify this through a lot of OpenAI's actions. You see stuff like Sora. You see maybe the erotica.
Starting point is 00:31:33 This is not very... Who's running the laser pointer? I'm running the laser pointer. Okay. You want them here? Maybe put him down here. Okay, explain that. Explain that.
Starting point is 00:31:42 Okay. I think that more and more, at least in the short term, OpenAI looks like a hyperscaler. They're kind of a junior hyperscaler. And I think their actions are more, you know... I think a lot of people want to say that they're bearish on OpenAI at current levels. But ultimately, when you look at how their business is evolving,
Starting point is 00:32:06 they seem to me like they'd be fine if the models plateaued. Yes, but, yeah, I feel like the mood on the timeline was much more slide Sam to the left, doesn't believe in AGI. There was that post about, like, if OpenAI really believed in AGI, they wouldn't be doing Sora, or they wouldn't be doing the erotica thing.
Starting point is 00:32:28 Like, all of those were very much, like, needs it, but kind of accepts that it's not coming, and so stopped believing it. Yeah, that was the risk. I think, in response to Jordy, because the vertical axis is also just about, like, whether it's going to continue to be economically useful.
Starting point is 00:32:45 So if people just stop using AI in general, or if the, you know, revenue stops accelerating or any of this stuff, I think OpenAI will be in a bad spot regardless of the models actually getting much better. Like, if they just make the models much more efficient to run, you could say that's not very AGI-pilled because the models aren't getting a lot better. But that's still like... But I'm just saying, like, if there was no more progress at all. Yeah. Like, we never got a new model from any of the labs. I think that OpenAI would add ads. They would add commerce.
Starting point is 00:33:19 They would increase subscriptions. Yeah, so they might be fine without AGI. They would make agents a lot better. But, I mean, the $1.4 trillion. The $1.4 trillion in commitments. Like, that is hard to justify if it's just the business today, just growing like it's growing, because that needs, like, crazy breakthroughs. Like, you're laying
Starting point is 00:33:43 out the bull case. They're playing, they're playing. You're laying out the bull case saying, oh, if they just add ads, they're going to be able to hit the $1.4 trillion, no problem. That's amazing. I'm not saying that. They can pull back on a lot of these commitments. Sure, sure, sure. I don't think these are going to end up being, like, real liabilities where the business is just cooked if they can't hit them, right? So, I'm just saying, like, I think there's a shot they have. They could, you know... When they talk about having success in consumer electronics, right, which is something you talked about yesterday, like, they don't need, like...
Starting point is 00:34:20 I think they can probably build a really cool device. Maybe it could be competitive with, you know... If it can be at all competitive with an iPhone, they could be fine without AGI, right? Also, I guess, let's get some other people on the board so we can see where people split. Yeah, I think it's useful to be relative because these are not quantitative numbers. Yeah. So next we have Dario. Okay.
Starting point is 00:34:44 Whoa. So Dario is up here. So Dario is kind of, when you listen to what Dario is saying, he is, you know, he's extremely AGI-pilled. Yes. Right? This is kind of the reasoning why he's so anti-China, right? Because he sees it as an actual race to superintelligence.
Starting point is 00:35:01 It is a national, you know, security issue. Totally. It's a problem if China gets there first. What is that new sound cue? I don't think Tyler has sound effects. What is it? UAV online. UAV online.
Starting point is 00:35:12 Okay, I like that. This is the UAV. We should get UAV aesthetics for sure. This is good. Okay, continue. But there's also a sense that he needs AGI. Yeah, yeah, yeah. Because if AI stops growing as fast, and you imagine that things
Starting point is 00:35:28 kind of settle where they are now, OpenAI is definitely in the lead. Sure. So you need a lot of continued growth for Anthropic to keep making sense economically, I think. Okay. Yeah, yeah. Sure. Who else is on here? So I think next is Larry. Larry Ellison. Yeah. Larry's in kind of an interesting spot here. So this is kind of a weird place to be, where you don't believe in AGI, but you need it. Okay. How did you wind up there? So I think... You're probably wondering how I ended up in this corner. Record scratch, freeze frame. So I think this is how you factor it in. There's kind of the personal rhetoric,
Starting point is 00:36:01 and then there's the actions of his company. Sure. So when you listen to Larry speak, he doesn't seem the type that believes in some kind of superintelligent God that is going to come, that's going to birth this new thing, and humanity will rise. But then you look at Oracle... Has anyone found his LessWrong username?
Starting point is 00:36:20 Is he on there regularly? I don't think Larry's reading Gwern. Okay, got it. But you look at Oracle and, you know, they need AI to work. Maybe they're levered up a little too much, or maybe not enough, depending on how AGI-pilled you are, but
Starting point is 00:36:39 you know, he's not very AGI-I-pilled, so it's kind of hard to square, but... It's a bold bet. Yeah, this is a unique spot, I think. It is, this is a unique spot. He's off the grid, okay. Yeah. So, who's next? So let's see, who's next.
Starting point is 00:36:51 Who did I... Okay, Satya. Okay. There we go. I think this is a fairly reasonable spot. Yeah. Obviously, there's, you know, there's some sense where he's slightly AGI-pilled, or maybe more than slightly.
Starting point is 00:37:02 He believes in the power of the technology. Yeah, I mean, he was very early on OpenAI. He thinks that AI in general will become very useful, but maybe it won't become superintelligent. Maybe it's not going to replace every person. It's just a useful tool. The quote I always go back to is him saying, like, my definition of AGI is just greater economic growth.
Starting point is 00:37:19 So show me the economic numbers, and that will be it. It's a very practical definition. I think people see him as very reasonable. He's not getting over his skis. Yeah, I like him in the center. I like him in the center of the grid somewhere. That seems right. He's also, you know, if AI doesn't work out, I think Microsoft is in a very good spot.
Starting point is 00:37:39 It's going to stick around. You know, they're not crazy over-invested in OpenAI. They have a nice share of it. Yeah. They'll write some code. If OpenAI works out, he'll do very well. If they don't work out, I think he's also doing quite well. He's hedged.
Starting point is 00:37:51 He's doing leases. Yeah, that is a good quote, too. Okay. Yeah, you wouldn't be leasing if you were super AGI-pilled, right? You'd hoover up everything. You'd keep it all for yourself. I think, let's actually talk about that with, I think it's Jensen next.
Starting point is 00:38:04 Okay, yes, Jensen. Wait, he doesn't believe in AGI? He's the whole reason why we're here. Explain that. So, yeah, this is maybe my personal take. Please. If Jensen was very AGI-pilled, yes.
Starting point is 00:38:14 I mean, he is the kind of, he's the rock on which this all is built. Yes. He has the chips. Yes, he has the chips. If he was AGI-pilled, he would not be giving out those chips. He would keep them all to himself
Starting point is 00:38:23 and he'd be training his own model. Okay. So that's why I think he's more on the, he doesn't believe in AGI's side. But if there's any kind of downturn in the AI economy. Could you put Sam, like, potentially closer in this direction, too, because he's talking about getting into the, like, compute reselling? Yeah, yeah, yeah.
Starting point is 00:38:42 So it's like, if there's so much demand and the models are progressing so quickly, wouldn't you want to just hold on to all that compute yourself? I think this would also be interesting to see over time. Like, Sam has definitely shifted leftwards over time. He's moving. Basically, this summer, you've seen a lot of the actions of OpenAI. They seem less and less AGI-pilled. I mean, pretty much everyone has been, like, moving the
Starting point is 00:39:08 AGI timelines outward, which you could translate into no longer believing in AGI. It's more just that the timelines have gotten longer this year, broadly. Pretty much everyone. Yes. There was a new blog post yesterday. It was basically AI 2027. There was a new one. It was AI 2032. So it's basically very small arguments.
Starting point is 00:39:24 Different team, but the team of AI 2027 was promoting it. Yeah. AI 2027 should be, like... it was actually AI 3027. We're off by a thousand years. Typo. We're AI 3027. We couldn't get that domain.
Starting point is 00:39:41 We're just off by 1,000 years. Continue. Yeah, but there's definitely the sense where if there's a downturn in AI, Jensen, the stock is going to go down. But it's not going to go down as far as Larry, right? Because they're still not levered. They don't have insane... This is not financial advice.
Starting point is 00:39:56 Yes. You know, they don't have, you know, completely irresponsible CapEx. They're not levered up that I know of. Oh, yeah. Their Z-score is through the roof. Their Altman Z-score is through the roof. They're looking pretty safe. Okay, let's see the next one.
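The Altman Z-score they're punning on here is a real solvency metric (Edward Altman's, no relation to Sam). A sketch of the classic public-manufacturer formula; the input figures below are made-up placeholders purely to show the mechanics, not NVIDIA's actual financials:

```python
def altman_z(working_capital, retained_earnings, ebit,
             market_value_equity, total_liabilities,
             sales, total_assets):
    """Classic Altman Z-score for public companies.
    Z > 2.99 is the 'safe' zone; Z < 1.81 signals distress."""
    a = working_capital / total_assets
    b = retained_earnings / total_assets
    c = ebit / total_assets
    d = market_value_equity / total_liabilities
    e = sales / total_assets
    return 1.2 * a + 1.4 * b + 3.3 * c + 0.6 * d + 1.0 * e

# Placeholder figures (in $B), not real filings -- just showing why a
# huge market cap next to tiny liabilities sends Z "through the roof".
z = altman_z(working_capital=40, retained_earnings=50, ebit=60,
             market_value_equity=3000, total_liabilities=30,
             sales=100, total_assets=100)
print(f"Z = {z:.2f}")
```

The D term (market value of equity over total liabilities) dominates for an unlevered company with a giant market cap, which is the joke being made.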
Starting point is 00:40:12 I believe, okay, Sundar. Sundar. He believes in AGI more than Satya, you think? Yeah, well, I think you can see this in kind of, they were even earlier in some sense than Satya, right, with DeepMind. So I think AI has played a fairly big part in the Google story very recently. They've always, I think basically for the past 10 years, they've been trying to get into AI.
Starting point is 00:40:34 Maybe their actions didn't actually do much, but, you know, they wrote the transformer paper. They're about applied AI. Like, they're actually applying AI for scientific discovery. Core Google search is just an AI product. It's just an index on information. They're organizing the internet. Also, I mean, compared to Satya, I mean, Gemini is a frontier model.
Starting point is 00:40:51 Totally. Gemini 3 is supposed to be this incredible model. Everyone's very excited. Satya is not, Microsoft is not training their own base model yet. Yeah. And also, like, there is a little bit of, like, if you really believe in AGI, the actions that we see are you, like, squirming and being like, I've got to get in. It doesn't matter if I'm 1% behind or 10% behind or 80% behind.
Starting point is 00:41:14 I've got to get in. And I think we know someone who's doing just that. Gabe in the chat says, is this AGI or AG1 from Athletic Greens? Needs AG1. Needs AG1. Fine without AG1. Believes in AG1. Now, we're the only podcast that is not partnered with AG1. But we do like green. Green is our hero color. Let's go to the next person. Okay, but also before then,
Starting point is 00:41:40 I think Sundar is also definitely below this line because, you know, Google has been doing very well. At first, I mean, people thought of AI as, like, oh, this is going to destroy Google. This is bad for Google. So if AI doesn't work, then Google is just in the spot they were before, which is doing very well. If AI does work, then, I mean, Gemini is one of the best models.
Starting point is 00:42:01 They'll do very well, too. Gemini, Google AI Studio: create an AI-powered app faster than ever. Gemini understands the capabilities you need and automatically wires them up for you. Get started at ai.studio/build. Yeah, continue. Okay, so I think, yeah, next is Zuck. Yes. So Zuck is also kind of an interesting spot. Yes. I think Zuck is actually someone who has shifted rightward. It's fascinating. Yeah. So you've seen this, basically, I mean, for a while. He's been traveling. He's been traveling. Yes. Yeah. So for a while he was doing open source, which in some sense is very AGI-pilled because, you know, you're building it, you're training a model, right?
Starting point is 00:42:38 It's like you're moving the frontier forward. But it's also open source, which you can kind of think of as, you're trying to commoditize everything. You don't think that there's going to be some superintelligence that will take all the value that you need to hold on to. You can kind of give it out. And now he's kind of moved towards closed source. We're going to get the best people. We're going to train the best model. It felt like Zuck was sort of like, oh, yeah, AI, it's this cool thing. I'm going to check the box.
Starting point is 00:43:03 I got my team. We did this fun little side project. It's this open source model. We kind of found our own, like, little lane. But we're not, like, competing in the big cosmic battle between OpenAI, Anthropic, DeepMind. Like, we're not playing in that realm. Do you think that was just a counter-position to, like, try to win the AI war? To say, hey, we're just going to try to commodify this market, like the Chinese approach?
Starting point is 00:43:30 Yeah, commoditize your complement. It's a good strategy. Yeah, that makes sense. And then maybe you could say that, oh, once the Chinese models have been getting better, he can kind of step out of that position. Sure, sure, sure. And he's moved towards closed source. Yeah.
Starting point is 00:43:40 But now it feels like he's way, way, going way harder on the AGI vision, paying up for it, investing so much money in it. You know, it's like, and depending on how much he invests, you could see him pushing up. It's true. Yeah, he's also moving up. But right now, the business is just insane.
Starting point is 00:43:58 It's so phenomenal. that even if he winds up spending all this money and they don't even get a frontier model and like all the researchers like yeah we didn't discover anything and we're just gone like the business is fine because it's such a behemoth. So that's why he's fine without EGI.
Starting point is 00:44:15 After earnings they took a fairly big hit so maybe he should be a little bit higher. Yeah, maybe a little bit. Meta broadly is still a very safe play if AI doesn't work out. And it was the same thing during the Metaverse. It was like he believed in the Metaverse, he invested in the Metaverse,
Starting point is 00:44:28 but he never needed the Metaverse. And so after the stock sold off like crazy, it went right back up because everyone realized that. But he's also raising debt off the balance sheet, which kind of could push him up into this zone, right? It's like, if you're not worried about it, why don't you just carry it yourself? Yeah, yeah, yeah. You carry it yourself. If you really believe in AGI, just let it ride. Yeah. Okay, who else are we missing? Okay, so I think next is... Is it Jassy? Oh, Elon, Elon.
Starting point is 00:44:58 Elon. Yeah, yeah. Okay, Elon. So, Elon, I mean, he's been AGI-pilled, I think, for a very long time. Super AGI-pilled, I agree with that. OpenAI co-founder. Even before that, I think he was fairly big in the safety space. Totally.
Starting point is 00:45:10 You see him, even on Joe Rogan, he was talking about AI safety. He still believes it. He doesn't back off. And AI safety is important because it's going to become super intelligent. It's going to take over the world. Totally. Totally. Totally.
Starting point is 00:45:19 Yeah, even this was part of the dialogue around the new comp package, wanting the voting power to be able to be part of securing his robot army. Oh, yeah. Interesting. Yeah, I think robots are very underrepresented on this board. Totally. And Elon is kind of the main player in that space right now. Yeah, so he talked yesterday about humanoids being sort of an infinite money glitch.
Starting point is 00:45:46 And I feel like you kind of need AGI in order to kind of unlock the infinite money glitch. Yes, but at the same time, very strong core business. The cars don't need AGI, the rockets don't need AGI, Starlink doesn't need AGI. So he's not entirely indexed on it in the way the foundation labs are, right? Yeah. I mean, there's definitely a significant part of the Tesla market cap that is basically betting on robots. Totally. But there's also a big part of it that's just based on the car sales.
Starting point is 00:46:15 He's in the middle between needs and fine-without. It's like, he is betting on it. He is betting on it. Is it needs? Yeah. But that's just one piece of it. Tesla needs AGI more than SpaceX. SpaceX.
Starting point is 00:46:27 Yeah. Yeah, SpaceX seems very uncorrelated with AI. Totally, totally, totally. Unless you have the data. He's happy to be a telecom baron. He's happy to, yeah, he's happy to be the new Verizon. Who else is on here? Okay, so I believe now is Andy Jassy.
Starting point is 00:46:41 Jassy. Yeah, so he, I think, broadly does not believe in AGI, although he has, you know, a fairly major stake in Anthropic. Yeah, yeah. Which maybe is very AGI-pilled, but they have not... He's hedging with Anthropic. Yeah. Yeah, I think that's basically what you can say.
Starting point is 00:46:56 He doesn't seem. at all worried about kind of the core business. AWS seems to be... And isn't Google building their position in Anthropic? I saw some headline about that. Yeah. That was wild. Imagine. Like, Sundar, such a beast.
Starting point is 00:47:09 So they've both hedged, really. They just have huge stakes. Broadly, Andy Jassy seems very quantitative. He's focused on the numbers. Sure. Realistic. He's not making these, you know,
Starting point is 00:47:18 grandiose statements about the future of intelligence or what humanity is going to be like. Yeah. When you look at the AWS earnings, it feels like they invest when the demand shows up and they build data centers when there is demand. And they are not in the business of doing, you know, a 10-year hypothetical sci-fi forecast based on breakthrough. So Ben Thompson on the recent Sharp Tech, we were listening to it on the way in. He was talking about how Amazon has repeatedly said, hey, we're supply constrained. We have a lot more demand than we can
Starting point is 00:47:51 fulfill with this new deal with OpenAI, the $38 billion, which only got announced Monday, which again feels like forever ago at this point. But Ben was kind of hypothesizing that they kind of let OpenAI jump the line, because that $38 billion deal was starting effectively immediately, versus some of the others. Like, Larry's deal with OpenAI was announced as this massive backlog of revenue that cannot be generated in a meaningful way today, because they have to build the data centers that can actually deliver the compute.
Starting point is 00:48:28 Yeah, I mean, and, you know, Amazon has been building data centers. They built all the data centers for Anthropic. But it still seems very restrained. It's not overly ambitious. Anthropic has had issues with capacity, and it's probably because, you know, Andy Jassy doesn't want to get over his skis. He doesn't want to build too much. So I think that's why
Starting point is 00:48:46 he's kind of on the left side. And also, I mean, if you think AI doesn't work, and we're going to kind of be in this, you know, the same spot as Web 2.0, you need AWS, you need your EC2 server. I think he's very well positioned there. And then I think the last one is
Starting point is 00:49:04 Tim Cook. Tim Cook. Whoa. Let's go. Let's hear it for Tim Cook. Criminally underpaid, but has done a fantastic job not getting over his skis. Yeah. This one I think is
Starting point is 00:49:17 fairly self-explanatory. I mean, he seems... He doesn't believe in AGI. He doesn't believe in LLMs. He doesn't believe in chatbots, apparently. I don't know. The new Gemini deal signaling we had... As of two years ago, Apple was like, yeah, we're not doing that stuff.
Starting point is 00:49:36 Yeah, there was that famous stuff. But I think what's under-discussed, my takeaway from our conversation with Mark Gurman about Apple's ambitions around their new LLM experience with Gemini, is that I think there's a very real scenario where Apple wants to compete for the same user base as OpenAI. They want that transaction-based revenue, that commerce revenue. Yeah, I disagree on this point, but... I'm not saying that I'm putting their odds of winning that market at over, like, 10%.
Starting point is 00:50:12 But I think that they would be right to realize that it's potentially an opportunity. Yeah. Is it a billion dollars a year that they're paying Google for... I believe that's the number, yeah. That feels really low. doesn't it? I don't know. It just feels super low to me
Starting point is 00:50:29 among all the different deals that are going on. When I think about, like, when I just think about the value of AI on the iPhone, and you're like, one billion dollars? Like, when I think about what is the value that we're,
Starting point is 00:50:44 that the market broadly is putting on, like, AI for the enterprise, AI for whatever. When Sundar on the Google earnings call was talking about their top, I think, 10 customers, and how many, like, trillions of tokens they were using, it really was netting out to, like, $150 million of actual revenue. And so this will be their biggest customer for Gemini
Starting point is 00:51:06 on day one. Until one of our listeners gets on, of course. Yeah. And I think, I mean, even with the Gemini news, Apple still seems very reactive to AI. They're not seeing, oh, this future where AI is going to do everything, and moving there right now. They're kind of seeing where the demand is, seeing where the users are, and then moving, which I think is very, you know, classically how businesses work. It's not very AGI-pilled. Also, on that point of $1 billion, like, I have no idea how they can possibly know the amount of inference that's going to happen in Siri plus Gemini over a year period. Like, there's just no way to predict that. Or can they? Like, I feel like if the integration's
Starting point is 00:51:54 good, there will be a ton of queries. It will, it'll... I didn't read the one billion headline, chat can correct me if I'm wrong, but I didn't read the one billion dollar headline that way. I felt like that was a technology license, not necessarily inference. And then there might be consumption on top of that. Yeah. Okay. Or maybe, yeah, maybe they're licensing Gemini and then they pay per query for, like, the energy and the CapEx. Couldn't a lot of the stuff be done on-device? For sure. Yeah. Because Gemini, uh, Gemma has, like, baked-down models that could be smaller. So that's possible.
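The token economics being gestured at here can be sketched with two numbers. Both inputs are rough approximations of what was said about the earnings call, and the token count is a pure assumption on our part, not a disclosed figure:

```python
# Hedged sketch of implied per-token pricing, as discussed above.
# Inputs are approximations / assumptions, not reported figures.

tokens = 100e12     # assume ~100 trillion tokens ("trillions of tokens")
revenue = 150e6     # ~$150M of actual revenue netted out

# Price per million tokens implied by those two numbers.
implied_price = revenue / (tokens / 1e6)

print(f"implied price: ${implied_price:.2f} per million tokens")
```

The point of the exercise: huge token volumes can still net out to surprisingly small revenue, which is why a flat ~$1B/year license can look low or high depending entirely on the assumed query volume.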
Starting point is 00:52:27 Chat says, where's Tyler on the chart? Feels like Tyler isn't AGI-pilled anymore. Let's figure it out. I actually am on the chart here. Let's go to Tyler. I am over here. So, yeah, I think I'm very AGI-pilled, right? You know, I'm ready for the Dyson sphere.
Starting point is 00:52:43 Yes. I think it's, you know, only a matter of years, a handful. Only a matter of years. It's a couple thousand, 100,000 days away. You need a publisher. 100,000 days away. Yeah, 2026. Yeah, 2025.
Starting point is 00:52:53 We still have a month left. We still have a month left. We still have a month left. AGI on Christmas, this Christmas. It's coming this Christmas. This holiday season. But why do you need AGI? I think I also need AGI.
Starting point is 00:53:04 Why do you need AGI? Well, I mean, look, if you look at the current jobs data for, like, college-aged students or post-grads, it's looking pretty bad. It's bad. So I think you kind of need AGI to really boost the economy. If AI does not work, the macro economy is looking not good. Oh, sure, sure, sure. So I feel pretty bad about my job outlook without AGI.
Starting point is 00:53:27 Sure, sure, sure. Even though you're already employed. That's hilarious. Well, thank you for taking us through. Chat wanted to know, where would you put Lisa Su? AMD. Yeah, so Lisa Su, I think she's broadly been quite reactive, actually. You've really only seen AMD start kind of making...
Starting point is 00:53:46 You don't get credit for being reactive, is what you're saying. Yeah. I think it's really only over the past year, maybe, that she's been making any kind of deals, right? You saw George Hotz, maybe a year or two ago, basically trashing AMD chips for how bad they were. But maybe she's given so much of the company away... Maybe she thinks the shares won't have very much value in the future. She's, like, happy to just give away 10% of the company. Yeah. So I think, if I had to pick somewhere, I would say that she's honestly getting a little close to Larry, in that
Starting point is 00:54:22 AMD is very much an AI play still, right? They're a chip company, obviously. But you don't get the feeling that she's a true believer. Yeah. Yeah. That tracks. Okay. That tracks. The new Siri, I'm reading through some of Mark Gurman's reporting.
Starting point is 00:54:38 Speaking of Apple and AI, the new Siri is planned for March, give or take, as has been the case for months. Apple has been saying for nine months it's coming in 2026. Apple simply reiterated that. And then on the actual deal, it's a $1 billion deal for a 1.2 trillion parameter artificial intelligence model developed by Google to help power an overhauled Siri AI voice assistant. The two companies are finalizing an agreement
Starting point is 00:55:03 that would see Apple pay roughly $1 billion annually for access to Google's technology, with the Google model handling Siri's summarizer and planner functions. Apple intends to use the Google model as an interim solution until its own models are powerful enough, and is working on a one trillion parameter cloud-based model that it hopes to have ready for consumer applications
Starting point is 00:55:25 as soon as next year. That is interesting. I feel like they're going to have to pay a lot more to actually run all the queries, generate all the tokens. But I'm sure we'll find out more as Google reports earnings and Apple reports earnings.
Starting point is 00:55:42 Before we get to that, let me tell you about graphite.dev. Code review for the age of AI. Graphite helps teams on GitHub ship higher quality software faster. What else is in the news today? We have some breaking news through the timeline. What's the breaking news?
Starting point is 00:55:58 T.J. Parker has finally found a new car that he likes. Huge. Guess what it is? Guess what it is. A new car that he likes. It's not the Sterrato. It's not the Lamborghini Huracan Sterrato. No. It is the Ford Raptor R.
Starting point is 00:56:15 Look at this. Look at this. Finally found a new vehicle I quite like and great gas mileage to boot. This was the problem with your Ford Raptor. It wasn't the R. You only had 600 horsepower instead of 8. That was the main problem. I also needed to park it in cities.
Starting point is 00:56:31 Oh, yeah. Which is just absolutely brutal. Yeah. What are the specs on the Raptor R again? It's pretty wild. How much horsepower? Raptor. Zero to 60 in 3.9.
Starting point is 00:56:45 720 horsepower. 5.2 liter supercharged V8. Fantastic. It's kind of the Julius.AI of trucks. The AI data analyst. Connect your data, ask questions in plain English, get insights in seconds. No coding required. I'm trying to go through the timeline. We got Will Manidis talking about some other stuff. Let's pull up this post from Pegasus, since it's relevant to the overview we just did. Pegasus says: Altman today, of course he's talking about yesterday, we're looking at selling compute, but we need as much as possible.
Starting point is 00:57:20 Zach last week, we could sell compute. Are we in a compute shortage or not? Because both are saying they're buying as much of it as they can and thinking about selling it. That's very interesting. Yeah.
Starting point is 00:57:33 I mean, the supply and demand dynamics, I like that story about Jensen where he understands the dynamic here, but there's still, you get kind of crushed if there's a sell-off in terms of demand. Like, you just have to go back and study what was going on
Starting point is 00:57:49 during the crypto boom for NVIDIA. Like, during the crypto boom for NVIDIA, it was like buy as many as possible. Just ramp, ramp, ramp. And then they literally had a glut. Well, yeah, and I can... Completely had a glut, and they took huge right off. The steel man here, if you're a large tech company
Starting point is 00:58:07 and you have the ability to, and the financing capital to buy a lot of GPUs, it's not the worst idea to buy as much as you can, so that you have preferred access to it, and then resell some if you have more than you need, right? Yeah. But, yeah, the question is... Historically, that does not have been...
Starting point is 00:58:28 How things have been done? Like, I understand the pitch there, but, I mean, we just talked to David Bazuki from Roblox, and he was saying that, look, we had our own on-prem, but then we had spikes of demand, and so we went to the hyperscalers for that, because they can load balance across while people are playing Roblox here
Starting point is 00:58:51 and then maybe they watch some Netflix over there, and they are storing all sorts of different data, and there's different workloads that happen at different times. And so traditionally, like, the hyperscalers have been able to service, like, multiple users across them. Jumping straight to selling compute... I think that the timelines are a little bit funky on this one. It seems odd.
Starting point is 00:59:18 It seems rushed, it seems rushed, especially when your, when your core product is growing so quickly. Like, Google Cloud Platform was released, what, seven years after Google launched? Same thing with AWS, same thing with Azure. Like, these were very mature businesses, very stable businesses with cash flow that were able to justify the investment. It was not something that was done in the growth phase as much. I don't know. Speaking of other wild, wild, you know, new business lines, Elon apparently confirmed that Tesla is going to build a semiconductor fab.
Starting point is 00:59:53 Yep. I've been advocating for this for a long time. I'd love to hear what Elon had to say. But first, let me tell you about fal, the generative media platform for developers. The world's best generative image, video, and audio models all in one place. Develop and fine-tune models with serverless GPUs and on-demand clusters. Let's play the Elon clip. What do we got here?
Starting point is 01:00:14 Let's play Elon X. From our suppliers, it's still not enough. So I think we may have to do a Tesla TerraFab. Whoa. Great name. That is a bombshell. It's like giga, but way bigger. Terra, yeah.
Starting point is 01:00:36 Terabyte. He's feeling good right now. I can't see any other way to get to the volume of chips that we're looking for. So I think we're probably going to have to build a gigantic chip fab. Morris Chang and TSMC is like, no, actually, I'm fine. I will supply you. Samsung is like, I'm good. I will do it. I'm good for it. Pull up this other video of Optimus. Even when it's crazy. Because there was some new moves of Optimus getting, getting jiggy.
Starting point is 01:01:08 Let's see. It is at the bottom. of the Tyler app. Look at this. Let's see. Look at this. Okay. So this was shown in contrast to the first optimist,
Starting point is 01:01:20 which was just a guy in a suit. This thing has motion. Yeah, it's not bad. This feels like getting pretty close to unitary level. Definitely at that level. I don't know. Imagine you're working late
Starting point is 01:01:37 at the office one night and this thing just walks out onto the floor. and starts looking at you and doing these moves. Yeah, weird. It's so weird how these new projects, like, they seem... I understand why a lot of people look at this with skepticism, but at the same time, like, it just doesn't seem that complicated
Starting point is 01:01:59 to just manufacture that thing and ship it. Like, they do it at, you know, in China. They do it all over the place. This doesn't seem that insane. And at the same time, like, Apple couldn't ship a car. so like these new projects like they do kind of Elon's ship some cars though yeah yeah uh I I meant it more as like if you're doing one thing it can sometimes be hard to branch out into the other thing I remember a few weeks ago I don't know there was a headline around uh Tesla ordering like 600 million dollars worth
Starting point is 01:02:30 of actuators yeah like it seems like they're going into some sort of production yeah um I wonder what who they're going to you don't need 600 million for like test just for testing. It's just the car analogy is so tricky because it's like most people that bought Tesla's already had cars, right? And so it was just like a one-for-one swap. Who are you replacing with this? You know? It's tricky.
Starting point is 01:03:01 What do you think? Well, with this you can replace like an exotic dancer? Yes, that's true. I don't know if many people have that an exquisite. exotic dancer in their life, who they're ready to, who they're ready to replace. It seems, I don't know, it'll just be very interesting to see where these things actually diffuse. Like, because if you sell them, if you sell them for one price to consumers to do their laundry, but they work in an industrial capacity, like people will just buy them and use them for that.
Starting point is 01:03:33 But when we talk to people who are in industrial environments, they're like, I definitely don't want a humanoid robot for that. I'd rather just like a big, massive robotic arm that's like bolted to the ground and can actually lift like 10,000 pounds as opposed to one that can only lift like 50 pounds. I don't know. Ryan says they're going to sell them to the police. We'll see. Maybe.
Starting point is 01:03:55 They certainly would be, I don't know, backup. I think they would be pretty effective deterrence. Maybe, maybe. I like Vitorio having some fun on the timeline. Going viral, getting community noted. Torio said, Elon Musk now has $1 trillion in his bank account. That's a thousand times $1 billion. He could give every single human on earth $1 billion and still be left with $992 billion.
Starting point is 01:04:19 Let that sink in. People love this funny math whenever it drops. What else is going on? The humanoid. Yes, China has a similar humanoid robotic project, although it's way, way scarier. because it went full Terminator mode on this. Of course, this one is hooked up to power, but... Yes, so this video was in response to...
Starting point is 01:04:45 Earlier this week, there was a... I think it was Unitary, like, presentation, and they brought out this new robot, and it was walking with such, like... Swagger? Natural gate. Yeah, basically that people thought it was actually just a person in the suit.
Starting point is 01:04:56 And it was not. And so they replied with the... Wow. ...that is remarkable. Yeah. Oh, because they put it... They put, like, the suit on the robot? Yeah, it was like a Neo.
Starting point is 01:05:08 It was kind of like the 1X robot. We're getting spooky, spooky territory. This is pretty crazy. Oh, wow, yeah, it does look human. It does look human. I'm looking at the other video. Sorry. Let me tell you about TurboPuffer.
Starting point is 01:05:23 Search every byte. Service vector and full-text search. Builds from first principles and object stories. See these new logos, John? Cheaper. You see these new logos? Oh, wow. Stacked.
Starting point is 01:05:32 Stacked. Absolutely stacked. They're cooking. Chris Bakke says, in the last 10 months, three very talented friends have joined separate hot early stage startups in senior roles and quit after realizing that the company's actual revenue was significantly less than what the founder had told them during the interview process and shared online. Hmm. Few things to clarify. I'm not joking. In two of
Starting point is 01:05:56 three, in two of the three cases, it would be hard to tell that you were joining a bad actor company on the surface. Solid investors. Founders worked at unicorn companies. The third was maybe more obvious. The only ones who really lose here are the employees. The investors have 200 portcos and can afford some losses. And our industry doesn't really pursue founders over fraud slash misleading information up to a certain point. FYI, if you're thinking about joining an early stage startup, ask to see their Stripe and or signed contracts before you join. Not bad advice. Yeah, again, this is going back to spring, right, that just felt like there was this, like, every founder was feeling this insane pressure to show
Starting point is 01:06:36 like a 1 to 10 million ramp that was just insane. Yeah. And the weird dynamic is that like as that pressure ramps up, you just get more and more incentive to fake it with community-adjusted ARR and contracts that don't actually stick and all sorts of different twists on something that's like, is it actually cash? Is it actually people coming in? Are they on long contracts?
Starting point is 01:07:04 The quality of revenue has been maybe degrading, but also just maybe in just, you know, it's just been easier to game than ever. There's been more incentive to game it than ever. So stay safe out there, folks. Kazakhstan has signed an MOU to buy up to two billion of advanced chips from NVIDIA. Let's hit the gong for Kazakhstan, warm it up, warm it up, warm it up, boom. Great hit. Great hit for our friends in Kazakhstan. Good to see them getting into the game. Does Borat take place in Kazakhstan?
Starting point is 01:07:45 Isn't that the whole thing? Maybe they should do Borat 3 about data centers. He goes to a data center. Well, Sasha Bear Cohen on data centers would be on the AI. Reprises his role, all to bring AI to Kazakhstan. Sasha Bear Cohen doing a doc, like a, like a, just on AI in general. Incredible. It'd be amazing.
Starting point is 01:08:05 So I can use it to do better web search? So I can use it as an auto-complete? You're not doing the bore-out voice, brother. I'm not going to do the borat voice. I will do the profound voice. Get your brand mentioned in chat, GPT. Reach millions of consumers who are using AI to discover new products and brands.
Starting point is 01:08:27 Go get a demo. The Kobe-S-E letter says, breaking Nvidia's losses, accelerate to negative 5% of the day, now down 16% since Monday's high. That marks a drop of $800 billion since Monday. Wow, that is a wild sell-off. Didn't Tyler quote this and say,
Starting point is 01:08:45 is this bullish or something like that? No, that was on the DoorDash. They went down 20%. Everyone's down 10 or 20%. We need a... Yeah, whether you beat or miss, you're going down. Wait, is this true. Raghav, Ari Emanuel, said he's working with Elon
Starting point is 01:08:59 to bring Optimus to the UFC? That would be incredible. be fake news. Please don't be fake news. We need robots in the UFC. I mean, Rogan and Paul Merlucky were talking about it on their podcast, and it certainly seems like a hilarious and wild thing, even just as an exhibition match before the real UFC, get an optimist. Though, here's a question. Would Elon want a human to thrash on Optimus? Is that good for his brand? Or would he want Optimus to win? And if Optimus wins, then is that actually entertaining?
Starting point is 01:09:36 I don't know. I think putting Optimus up against the best MMA fighters in the world is fine. It's impossible, though, right? Because if you just kick metal, you'll just break your foot. And if you punch metal. This was Ari Emanuel at the All-N Summit. Yeah. It was released, I guess, one day ago.
Starting point is 01:09:55 Oh, interesting. But he was talking about wanting to bring. Wait, wait, what a weird, what a crazy, crazy timing. That's so. He says, I saw what he's creating, the man's a genius. I want to do a UFC fight with his robots. Yeah, yeah. Very cool.
Starting point is 01:10:10 Heisenberg, sharing a little bit of red here. Microsoft down 10% in the last eight days. Nvidia at 12% in the last four days. Allentier at 16%, down 16%. Meta, down 18%. Losses have accelerated. Compound 248. was sharing yesterday, he's like,
Starting point is 01:10:33 when I clip this, I didn't expect to crash the markets globally, but he certainly that clip
Starting point is 01:10:46 seems to have played a part in this little sell-off. Always take a victory lap for having world historic consequences. I actually think that that particular clip was like less than 5% of the total views on that clip. because there were other people that clipped it
Starting point is 01:11:02 and it got a lot of views. Oh, for sure. You went everywhere. But always claim victory. That's the lesson. That's right. Anyway, we have our first guest of the show, Catherine Boyle
Starting point is 01:11:10 from Andrews and Horowitz, joining us. She's in the re-stream waiting room. Welcome to the TV channel. Catherine, how are you doing? It's great to be here. Good to see you guys. Why do you play the night vision
Starting point is 01:11:20 goggle sound? You already got new soundboard today. I felt a little tactical. I felt a little tactical. It's great to see you. Good to see you. Should the Department of War get a soundboard
Starting point is 01:11:30 during these speeches? I heard Hengseth was given a big speech. Are they sticking mostly to the cheers and the clapping or are we doing firing off muskets? Are we firing off cannons? Is there pageantry? Is pageantry
Starting point is 01:11:46 alive in the Department of War speech? Bring the pageantry back to the Department of War. That is important. You guys should definitely have your own sound for them. For sure. You guys are getting the first look at a major speech. Yes. So it is just ending right now. I just hopped off the live stream. And I can tell you, my phone is blowing up with
Starting point is 01:12:06 just how many people are enthusiastic, not only in the venture community, but just across Washington about what has been said in this speech. And it's going to have dramatic repercussions for defense tech and for Silicon Valley and the broader defense innovation base. So it's, it's extraordinary speech. You'll be hearing a lot more about it. But you're getting the first look. Fantastic. Take us through it. And it is, it is definitely, I'd say, you know, for the last say, 10 years or so, we've been talking about defense reform, why it is so important for increasing competition, for startups. And what is amazing about this speech is that it's not just lip service to, yes, of course, we need competition, basically what you've been hearing in
Starting point is 01:12:44 Washington for the last 10 years. It goes line by line. HEC Seth actually made a very, very funny comment. He said, if you're watching this on Fox News, your eyes are glazing over because this is the most boring thing you've ever heard. But for everyone in the room from the Andrews, the the Palantiers, the SpaceX's of the world, all the way down to the little guys to the primes, it was like Christmas come early, right? Because it's, it really is changing the way that acquisitions are going to happen inside the department in ways that are so meaningful, not only for Silicon Valley companies, but for private equity back companies and for the primes. He basically told the primes in the very beginning, you have to be investing more in research
Starting point is 01:13:22 and development. The companies that have been getting the large programs cannot just continue as business used to be done. He actually quoted Donald Rumsfeld. Didn't tell us he was quoting Donald Rumsfeld for the first five minutes and basically went line by line through his speech that he gave on September 10th, 2001,
Starting point is 01:13:39 about how important it is to change the defense acquisition process and he said, you probably don't recognize this, but the speech I am giving now is actually Rumsfelds. Wait, September 10th? Yes, yes. The day before just happened to be...
Starting point is 01:13:52 Happened to be the day before where he said, we have to change the equity process. That is a crazy coincidence. Very crazy coincidence, but it shows you just how long this has been going on. Sure, sure, sure. So I'm happy. I have my notes here because I was taking Kobe's notes throughout the speech. Yeah, let's go line by line.
Starting point is 01:14:11 Let's let startups know what's going to be changing. Yes, please. But the first thing that I think is really important for all of the startups in Silicon Valley is that he mentioned and called out the importance of commercial first technology. So this is going to shock people who are not in the defense world. that right now, if you need to build a product specifically for the Department of Defense, the vast majority of contracts that are given out are not given out with a purview of what exists already in the marketplace that we can buy.
Starting point is 01:14:37 Yes. People, you know, it's a requirements process, a very long 300-day and, you know, requirements process, actually, that he called out that requires the different program offices to say what they need, to write it down, and then to find people who will build explicitly for those requirements. They are doing away with the old requirements process and saying this is not what we're going to do anymore. We are going to focus on commercial first technologies, even if there is a technology out there that is only 85% of what we need. We will buy it. That is the goal. And then we will turn it into what we need to for actually for the warfighters. This is Cots versus
Starting point is 01:15:14 gots for the DOD, DOW, nerds, like commercial off the shelf versus government off the shelf. This is what Anderol and all the defense tech community has been beating the drum on for basically a decade now. And it feels like it's finally getting echoed back from the administration. Is that correct? Yes. And there was a lot. The administration in the EO that was out a couple months ago said that they wanted commercial first technology. Now you're hearing it straight from the secretary's mouth. There's always dust steps about this on Capitol Hill with, with, you know, various parties arguing over whether we need more studies or whether we should just do this. It looks like now this is real change that's about to happen, and it's going to be transformative for all of the
Starting point is 01:15:55 startup community. What it really says is that the best products will win. Yeah, who are the losers here? Well, I think, I mean, the major loser from this speech is the Primes. I mean, he said, I don't want to pick on anyone. I'm not going to call out companies, but you have to be investing in research and technology. Like, you have to be, the fact that the Primes only spend 2% on research and development and are focused on, you know, constantly kind of maintaining the status quo, that is not going to happen on his watch. I think it was a... 2% on R&D.
Starting point is 01:16:26 Yes. And haven't historically been acquiring venture-backed companies. So this is also the problem where they're really just on a much slower timeline than is needed by the department. And I'd say that the takeaway from the speech, and I think the soundbite you'll be hearing more and more about is that the new process is really going to be used to enable speed and volume. And one of the things that I think, you know, a number of. of companies have been talking about for a long time is that by taking more risk in the acquisition
Starting point is 01:16:55 process, you take less risk on the battlefield. And he actually echoed that. He said, we want to take more risk. We want to have, we want to reward the people inside of the department who are making risky decisions or supposedly risky decisions because they're acquiring new technologies or from companies that are that are not Raytheon and the kind of big guys, right? But what they, what they ultimately said is like they want those, the people who are going to be making this acquisition decisions to be taking on the risk because it takes less it will affect the risk of the warfighter right it will make things safer for the warfighter so it's it's basically linking this entire process bringing it back to to to what needs to be done um before war start and before
Starting point is 01:17:36 products get into the hands of the this is like a a top down directive and then who are the players that need to actually put this into like who are the kind of key players that need to put this into practice yeah so I mean like the the the main thing that I this is definitely a message sent to Congress because there's, there's currently, you know, two different bills going through on Congress that are actually pretty aligned in most of the things that they're saying. There's a few callouts where I think you'll see sort of some sort of back and forth between the House and the Senate. But this is definitely a call to them. Like this is happening on the Secretary's watch. This is something that he cares deeply about. This is something
Starting point is 01:18:11 that the administration cares about. And so it definitely is a huge push, not just a, not just a major signal, but they sort of outlined this is how we're going to change things inside the department. One of the biggest things that they do is they created this sort of portfolio-based approach of how they're going to be allocating capital to companies now. And this is always shocking when I explain to people that this is revolutionary because you'd say in a normal company, you're given a budget. And if the vendor you choose fails, you'll go to another vendor and you'll use that capital in order to buy whatever product you need. That is not how the department works. And so with this new portfolio-based approach, they're basically saying we're going
Starting point is 01:18:47 to give what's known as the PAE's, the portfolio acquisition executives, the right to say, okay, if this one product that we acquired 60 days ago isn't working on the battlefield, we can use the rest of the money and reprogram it for a product that is working. And right now, that is not possible inside the Department of Defense. Right now, you have to write a whole another list of requirements in ways that you would build a new product in theory that can take years to actually build. And by the time it's built, it isn't meeting the requirements or the needs of the warfighter. So the fact that they're even changing the budgeting process and that they have an explicit call out of how they're going to do that, that will show up in this year's
Starting point is 01:19:24 NDAA once it's passed, and it will be the way that things are done from now on. So this is extraordinary. Yeah, so this means that if a contract is allocated to a certain company and they start underperforming in real time, the person who's in charge of allocating that budget can basically say, hey, this other startup, like we're going to like rotate this budget over here because they're actually going to deliver on this. I've heard some. Yeah, I've heard some horror stories and even opportunities where some of the bigger primes have just been, have just held on to contracts forever. They're kind of like barely delivering. They're kind of like trying to deliver and startups coming in and saying like, hey, we'll actually take this over. And sometimes those deals can happen.
Starting point is 01:20:12 but this seems like it'll be a lot more effective. Are there any new categories that are opening up generally? I mean, we've heard this week about like data centers in space. So if you're tracking like data centers or space, like that's a story. But I remember Secretary of the Army was talking about modularity of weaponry, taking a part of drone, fixing it. I feel like the whole idea of drones or autonomous submarines, these are new categories that sort of opened up.
Starting point is 01:20:41 there have been a number of competitors, but at the thematic, like, within defense technology level, are there any categories that you think this new administration is maybe more excited about than ever before? Yeah. So, I mean, you mentioned modularity and what the Secretary of the Army is talking about. I mean, this is, this was actually called out in Secretary Hegs speech where he said, part of the reason why we need these PAEs and we're formerly called PEOs, but PAs, these portfolio-based approaches, is because right now, Now, if you want to take something apart and put it back together, you have to build an entirely different system to acquire those parts.
Starting point is 01:21:17 And you have to go through a requirements process. So if you're building modular-based approaches that update in real-time, and he actually said, like, we want it to be more like software. We want to do software updates in real-time and not have to go through a requirements-based approach. You need to be thinking about things in terms of modularity. So I think there's a lot of different, even just changing the way the budgeting works, it opens up a lot of different technologies that were not able to participate in the acquisition system because they didn't want to go through the requirements process or they knew they wouldn't
Starting point is 01:21:51 meet the requirements that are being put out by the Department of War. So it is going to radically change, I think, not only the types of products you're seeing in the hands of the warfighter, but just how quickly they get there. Makes a lot of sense. Switching gears a little bit. Sam Altman suggested on Wednesday this idea of like my my interpretation of it was kind of like a go co model for AI factories without commenting on on that specifically can you give us the history of go co's when and how they work well and what kind of some of the challenges have been well I actually didn't see what can you give me a little more context and what yeah so yeah so I'll read exactly what he said. So this was in response to clarifying the sort of backstop gate.
Starting point is 01:22:43 He said, so anyways, he basically said, obvious one, we do not have or want government guarantees for open AI data centers. And then, like, later in the post, he said, what we do think might make sense is governments building and owning their own AI infrastructure. And then the upside of that should flow to the government as well. We can imagine a world where governments decide to off take a lot of computing power and get to decide how to use it, and it may make sense to provide lower cost of capital to do so. Building a strategic national reserve of computing power makes a lot of sense, but this should be for the government's benefit, not the benefit of private companies. And just in some of the ideas around this just reminded me of how Gokos have worked
Starting point is 01:23:22 in defense tech where the government is basically saying, like, we have, we're willing to commit resources and land to like a certain initiative, and then you can have the private market potentially pay a role. And I have no idea how this would look like in the AI context, but I just thought it would be helpful to understand how these sort of like public-private partnerships have worked in the past, specifically in defense. Yeah, well, no, I can talk to one that was actually discussed today. So the Secretary of the Army was on Squawk Box this morning talking about specifically, if we are giving the land and security for massive data centers, could the Army have its own data centers and be one of the largest providers and actually offtake that compute.
Starting point is 01:24:05 So I do think that there are ongoing conversations. Candidly, I haven't seen, you know, massive conversations about how it's going to be enacted by the Department of War. But I thought it was very interesting that the Secretary of the Army was basically saying that this is something that they would be very interested in, that they know that they are going to need their own compute and that if there is sort of a tradeoff of we provide the land and the security, there should be an understanding that that compute then goes to to the Army. So I don't know that they fully flesh out the details of those sorts of things,
Starting point is 01:24:35 but I do think there's a lot of conversations about this. And it's good to hear that Sam is also talking about that in certain ways. But I do think for the Department of War, they know that it's top of mind. And I think there being a lot more experimental, which again comes back to if you have a system of requirements that is far more sort of open, I'd say, to experimentation or working with new companies that can do that, it does give the secretaries of the Army. the Navy a lot more authority to make those decisions on kind of the actual things like compute or energy that they need. Yeah. How do you think about the flow between like entrepreneurial ideas, like problems that are identified in the private sector,
Starting point is 01:25:16 somebody starts a company? Like with Anderil is like the flipped model, this idea that R&D should maybe live outside of the taxpayer dime, be funded by venture capitalists. That company gets built. Eventually those ideas are percolating into D.C., and then the world updates, America updates, we change the way we work, things get better, but Anderol benefits, and people are pointing the finger, you know, like, hey, you're the one that advocated for this. And I feel like that's kind of where Sam's caught right now, where Open AI is advocating for things that I think are good for America, but also might benefit Open AI. And so he has to do this delicate dance of like, well, I was independent on this one,
Starting point is 01:26:00 but this one does benefit me, et cetera, et cetera. It's like such a communications challenge. Do you think it's like, do you have any best practices for like how to not get over your skis on that? Well, I think it's, I think as you said, it's sort of a typical Washington story where in order for things to change, you do have to sort of, you know, kind of Washington's a meme town, just as Silicon Valley is a meme town. Like you have to be able to tell your story why it's so important.
Starting point is 01:26:25 I would argue that Andrew L was exceptional at really telling the story of why. why you need attritable systems, why you need modular systems, basically, the kind of, you know, the kind of frameworks that we're using today inside the Department of War came from those memes being established. But yes, if you're, if you're successful inside of Washington, you're going to tick off a lot of different people. Totally. And you're going to certainly upset sort of the bureaucratic class. And what was very interesting, too, about this speech is that the secretary actually called out the bureaucratic class and said, like, this is not a problem with people that we have. This is not a problem with innovation or technology. It is the bureaucracy that is stopping us from making great decisions.
Starting point is 01:27:06 And I think, like, that is something that has, you know, always been the problem with D.C. And I think, you know, the most important thing that companies can always do is just continue saying what they believe, going to Washington, building those relationships. And I think, if anything, over the last 10 years, this defense acquisition reform, which everyone said was impossible, is now actually at our fingertips and going to happen because it makes sense. The argument has been made. It's been made in dozens of different ways across many administrations, and everyone's in agreement on it now. So for founders that feel like, oh gosh, look at these great companies that are getting backlash because they've, you know, kind of spread their own gospel and now they're being attacked for it. Like, that's actually, I think, a sign of progress. If you're being attacked for what you're advocating in Washington, that's usually a good thing.
Starting point is 01:27:54 Yeah, no, that makes a lot of sense. Final question. What is your most up-to-date guidance for your portfolio companies around the shutdown? Yeah, no. So, I mean, it definitely is having an impact, right? Like the companies that were expecting contracts to come in or reprogramming dollars, like they're not going to necessarily see it on the timelines that they thought, maybe, you know, a quarter ago. That said, I do think that a lot of these provisions that are happening right now are so transformative
Starting point is 01:28:24 that in the long run, if we look back on what happened in the year 2025, it will have been a very, very good year for startups. Maybe their numbers will lag by a quarter just because, okay, the funds can't be delivered, depending on how long the shutdown goes on. I think it's having a greater impact actually on the citizen, right? Like, I mean, what's happening with air traffic control, what's happening, you know, across civic benefits. Like, that is something that I think we're now all feeling the pain of,
Starting point is 01:28:49 or people who are dependent on those benefits are certainly feeling the pain of, or people who are traveling or planning to travel for upcoming Thanksgiving are certainly going to feel the pain of the shutdown. But it's certainly having an impact on companies. And I think most companies are still working exceptionally hard on the things that they can work hard on now, even if the contracts can't get signed or the press releases can't be put out or the dollars aren't coming in yet. But I think when we look back this year, the story of this year will actually be the defense acquisition reform and what's happening for these companies in the long term versus sort of the short term.
Starting point is 01:29:22 Yeah, short term pain, long term gain. Yeah. Awesome. Well, thank you so much for taking notes and giving us a full breakdown. Thanks so much. Always great to catch you. We'll talk to you soon.
Starting point is 01:29:32 Thanks so much. Have a good one. Let me tell you about Linear. Linear is a purpose-built tool for planning and building products. Meet the system for modern software development, streamline issues, projects, and product roadmaps, and start building. And we have our next guest, Mikey Shulman from Suno, in the Restream waiting room. Here he is.
Starting point is 01:29:50 Welcome to the TBPN Ultradome. How are you doing? It's great to be here. I'm doing great. How are you guys? Fantastic. Fantastic to have you. I'm very excited about this.
Starting point is 01:29:58 I have a million questions, and I know that we're just going to go super deep into all the details of user behavior, but kick us off with just a general high-level introduction on you and the company, and then we'll go from there. Cool. Yeah, great to be here. I don't know. I'm not so interesting. The company is much more interesting. The way I like to think of it is we are delivering the best, most valuable digital musical experiences to the whole world. And right now, the things that are available to the end user are not always amazing. They're not always differentiated. And I think that's just like a failure of imagination that you can do so much more with music and you can enjoy it so much more. That's how I think about us. What was the first song that you ever made? Ever made?
Starting point is 01:30:45 Yeah. I've been playing piano since I'm four. I played in a lot of bands in high school and in college. I play every day. So I don't actually know the answer to that, but it's probably like chopsticks or something. Are you a foundation model company? or an application layer company or both? Look, the answer is both.
Starting point is 01:31:03 You know, all the technologies are ours. These didn't exist. But I think about, and maybe this will color the conversation, and certainly, you know, when you guys were discussing us last week or two weeks ago, all that matters here is the product. All that matters is that we deliver an experience to a user that makes them feel a certain way. And so at the end of the day, if there were a way to do what we're doing without AI, we would probably do that. There just isn't. Yeah, yeah. So there is a win
Starting point is 01:31:33 condition where you don't even train models and you use someone else's models, but as long as you have the best application, that's where the value occurs? I think so. At the end of the day, users don't care. And I think probably at some point, maybe there's like a last model we officially release. Yeah. And the rest is just like amazing product updates. No, no, I completely agree. Really good. That makes sense. Yeah, that makes a lot of sense. Walk us through. We normally don't talk about the history of companies too much during interviews, but because you're, I mean, this is the first time on the show. Like, I'd love to kind of understand the different inflection points for the business. I mean, you guys
Starting point is 01:32:11 just grown tremendously over the last, over the last year. So walk, yeah, walk us through kind of the history of Suno and then kind of the history of even AI music and how those two things track? Sure. We launched this product a little over two years ago. I think it was in September, and it was kind of barely passable back then. It was a Discord bot. This is actually something I got wrong completely. Thankfully, I have good co-founders who disabused me of my bad ideas. Was that like a Midjourney thing? Were you inspired by Midjourney? Yep. It was working so well for Midjourney. And I thought we'd be on Discord forever. And then it was like around Thanksgiving
Starting point is 01:32:53 of that year, we released like a really thin web app. It didn't even have the full functionality of the Discord bot. And it took five days for 90% of the traffic to move over to the web. So if there was market signal that I got something wrong, it was like very, very strong. And I think slowly but surely, you know, we keep releasing new products, new models. The product gets more engaging.
Starting point is 01:33:16 The music gets more impactful. And slowly but surely, I would say the cartoon you should have is it becomes more and more appealing to a larger and larger group of people every time you make the product better. And so it's basically been a lot of iteration. Yes, there are inflection points. Everything consumer is like a little bit lumpy, but that's the history of it. Yeah.
Starting point is 01:33:37 What did the early cohort of users look like? How were they using the product? Like, why were they there? I'm assuming, you know, I'm sure you had kind of a cult following early. What did that early cohort look like? And is it that much different than today, or is it just scaled up?
Starting point is 01:33:56 It's very different today. So the early cohort, I would describe as like very tech forward and music curious. And it's like, here's this new thing. You kind of have to suspend disbelief. You need to be really forgiving because the music actually isn't all that good. And as you make the product better,
Starting point is 01:34:15 you grow out from there and now. It's like people who love music who are like a little bit tech curious. And actually at this point, you don't even need to be all that tech curious. there's like a fairly easy-to-use experience on both web and mobile that you can just kind of dive into. And so I think the early users are indeed like, you know, if you were on Discord, when you put it this way, if you were on Discord, like you are a very different kind of user than the type of person who might find our mobile app. Yeah, yeah, exactly, exactly.
Starting point is 01:34:43 Yeah. And so today, I think part of the debate that we were having, and you can deliver some truth around this, is like, what are kind of the key cohorts that are using and loving Suno? Everyone by now has probably seen viral short-form videos where people are combining two different songs, and some of the reach on these assets is insane. You'll see somebody combining a rap artist like Future with jazz music and getting tens of millions or potentially even more views on this type of content. But then we've also heard stories of professional musicians using this to accelerate their workflows, and we've now heard stories of people that are creating music just for
Starting point is 01:35:32 personal consumption. So what do the different kinds of cohorts look like, and what kind of user are you guys most focused on? So the professional content creator, be they professional music creator or other content creator, this is like mid single digit percentage of our user base. And the vast majority of people are people who love music, but aren't making music for any other purpose. That's just, we call it creative entertainment. This is just like a new behavior. And so I think trying to apply too much of what you know from other things, be it consumer social platforms or video games, you always just have to take everything with a grain of salt.
Starting point is 01:36:13 But most people, they come and they find that music making is incredibly enjoyable. And if you have studied an instrument and made music with your friends, like you probably already know this. And you just don't believe that it can be done digitally. And then you kind of find the product and you realize, oh, my God, this can be done digitally. I'm going to enjoy the hell out of making music. I usually try to cartoonize our average user as people who would say,
Starting point is 01:36:37 I love music. I love to sing. You don't love it when I sing. That's kind of our sweet spot. That makes sense. And that tracks on like actual monetization? It's not like those are the free users and then all the revenue comes from professional people or like API?
Starting point is 01:36:53 Like it actually matches roughly what you described. That's right. And so, you know, you guys mentioned a couple weeks ago the Chris Dixon thing, like, come for the tool, stay for the network. I think about it a little differently for us. I kind of think of it for us as like come for the gimmick and stay for the joy. And this is like the party trick of making that song that, even if it's not, you know, your favorite artist, which you largely can't do on the platform,
Starting point is 01:37:20 but it's, you know, make the country song about Debbie being late. That's the party trick but there's something that is behind the party trick that you get pulled into and all of a sudden you are a music maker and you're enjoying it. And so I actually, I don't think it's bad that
Starting point is 01:37:34 you know, there's like this song that is meant to be consumed once that you're never going to consume again, that you laugh for four seconds with your friend. I think one of you said, like, I almost just need to know the prompt and then I don't need to know the rest of the song. I don't actually think that's bad, because there's this amazing experience that it gets backed up with that people spend hours a day on.
Starting point is 01:37:54 Totally. Yeah. And when I think about some of the most fun moments of my childhood, it was my dad writing and playing a song on guitar and singing it one time and never singing it again. And I still just remember the moments and the joy of that. And so the concept of ephemeral songs that can be created for a moment, that historically you would have needed, you know, studio time, you would have needed to spend all this time writing the song, producing it, all this stuff, and now it can be done in an instant, I think it really is totally magical. I think this is similar to, like, there's ephemeral pictures on your phone, right? And you never go look at them again unless Apple tells you
Starting point is 01:38:35 Yeah, yeah. Where are the rough edges, or where are the edges that you want to be careful around? I feel like with ChatGPT, no one was really expecting the whole getting one-shotted and psychosis and delusions of grandeur. That came out of nowhere, in my opinion, and then all of a sudden it was an issue and they had to work on it, and of course there's guardrails that they can put up. Midjourney, I think a lot of people use that as art therapy. I haven't seen that many people or heard that many anecdotes about people going crazy because of Midjourney, but still, if you're doing image generation, you don't want to generate adult content, for example. Do you have an idea of the shape of the battle that you're going to be waging
Starting point is 01:39:16 against, like, the risks that you want to fight off? Yeah, look, I think for us they're actually just fairly lower stakes than those other ones. Like, sorry, I don't think we're turning anyone into paper clips, and not that I think that that's a real thing anyway. And, like you said, I don't think we're inducing psychosis in anyone. You know, this is a tricky question. Music, the content has kind of always been filthy in the best music, and sure, I don't really want us to be the arbiter of that. So we have some content moderation that's, I would say, pretty forgiving. And currently that's actually it. And then lots of copyright moderation, of course. Yeah. I was raised on Lil Jon and the East Side Boyz. And it was indeed
Starting point is 01:40:03 filthy. Yeah. And yes, it would not be your place to necessarily filter that. I would imagine that you at some point have to do some sort of KYC or age verification, or even something that's just like parental controls, that type of stuff. That certainly makes sense. I have a follow-up question, but... Go for it. Yeah, there was an article in the Journal yesterday about how Instagram had used PG-13 as a heuristic
Starting point is 01:40:33 for the type of content that they would show to teenagers on Instagram. And the MPA actually pushed back against that. And I was very upset about that because I feel like we need shared language. for how we describe what is acceptable on new tech platforms. So, like, I want you to be able to put the label of, like, explicit content. Remember that explicit content on every album? It became so iconic, it was actually cool, which maybe had a bad effect. But if you put that, if you put that black and white flag with the explicit on there,
Starting point is 01:41:06 I know exactly what I'm getting. And I know that I'm going to keep that out of the hands of my kids for a number of years. And I would hate to be in a situation where you couldn't use that and you had to come up with some new jargon for your policy. I love just building on the shoulders of giants, but have you thought about adapting any of those terminologies, or just making your moderation language match what people already know? Is that relevant to you? Not as much as it is on other platforms. I mean, but it is, you know. I feel like we should not be the arbiters of a lot of these things. And, you know, quite frankly, maybe you don't want us to be the arbiters of these things, and there should be other guidelines set by other things. And so, like, yes, if you're dealing with children,
Starting point is 01:41:56 you're going to have different rules about the songs that you can make and that you cannot make. And actually, right now, parents making songs with their kids is actually a mega use case. It's probably half of my usage, which is a lot. I use the product a lot. And so it's like, right now this is actually not the biggest focus for us. Like, what we have works pretty well.
Starting point is 01:42:16 There are always going to be edge cases. There are edge cases in moderation on every single platform, literally, you know, ever. And we kind of work through them as they come. Yeah. What are your conversations like with people in the music industry? There was some news earlier this week around the first, like, AI artist signing a record deal. I don't know if it was actually the first, but that's how it was being reported. How are the conversations going with your users that are, like, true
Starting point is 01:42:43 pro users that are using Suno to either, you know, experiment with a bunch of different variations of songs, get new ideas, make entire songs themselves? What are those conversations like? You know, I think there's been a huge shift in the last six months. I would say now I really don't meet a lot of professionals who don't use Suno at least a little bit. Like, songwriters use this a ton. We run a lot of camps. It's cliche, but a creative companion or a creative co-pilot for helping craft the right songs, you know, verse by verse, or getting ideas. Basically every producer I know uses Suno. It was recently put to me in an incredibly pithy way that we've become the Ozempic of the music industry. Everybody's on it, but nobody wants
Starting point is 01:43:29 to talk about it. And while that might be frustrating for us in the interim, that nobody wants to talk about it, in the long run, that's actually great. In the long run, the fact that everybody is using it and loves it is fantastic. And so it's this weird dynamic, I think, where one-on-one people are quite pro, even though the industry as a whole may have different feelings about it. It's interesting. Yeah, I wonder how this, the creative element of this really throws me, because on the one hand, if I'm thinking about Suno in the creative realm as, you know, Cursor for your workstation, your
Starting point is 01:44:13 Fruity Loops or your Pro Tools or your Ableton, and you're generating samples and actually working through that, I feel like that is something that, yes, we'd get to 100% penetration. But on the flip side, we've had the ability to generate blog posts and text for years now at human Turing-test level, and yet it does not feel like we're close to just, oh yeah, just put in a prompt, write a good blog post, and it comes out. Like, you still need that spark, that inspiration, and then most people, I feel like, are still not, they're still in the co-pilot era for sure. And I feel like that's interesting. Yeah, I don't know.
Starting point is 01:44:58 I think there's a big difference, though, here, which is that there's a few big differences. But for me, you know, hearing you say that, it makes me think a lot of, you're thinking of us largely as a tool. Sure. It's because a lot of AI products are tools. And the point is not to just turn this blog post out. Actually, the point is to have enjoyed the process of it. I happen not to really enjoy the process of writing so much. Maybe you guys are different.
Starting point is 01:45:20 But for us, the way people use Suno, it's like you're coming. Yes, there is some song at the end of it that you listen to, and the fruits of your labor are enjoyable. But the journey of doing it is actually why you're there. And so the actual tool analogy doesn't really work there, of like, I don't just write in the prompt and it comes out perfect. That's not even the point here. Yes. So I think where I'm getting is that the actual market for AI audio is going to bifurcate, in my opinion, and there will be people, just like there are people right now that use LLMs to just chat and just have an AI therapist, or they have a friend, or they just ask ChatGPT 4o, you know, hey, what should I do at work about this? They're just having a conversation.
Starting point is 01:46:08 And then out of text generation, out of LLMs, we also got just the ability to do code completion. And so we got, you know, Windsurf and Cursor and Codex. And, you know, we got these, like, coding agents. And then we also got, like, AI therapists. And these are two wildly different things from the same underlying technology. And I feel like I would just be shocked if you don't see both emerge over time. I mean, maybe a different company comes in and takes one more seriously than you, because the market's really big. But I would imagine that you wind up servicing both to
Starting point is 01:46:42 some degree, no? For sure. I mean, we already service both. But, you know, I almost think of this as, in the coding example, you might vibe-code something. Yeah. And that's just a few prompts. Yeah. Or you might actually need fine-grained control over the thing. And then you use a different tool, like something that's basically auto-complete on steroids. Yes. I don't mean that in a pejorative way. It's like amazing. No, no. 100%. Music is the same way. I might vibe-create something and have a session where
Starting point is 01:47:12 the music is generally what I'm looking for and I enjoyed the process of getting there, but maybe that's the end of it. And if this is going to be a song that is everlasting and intended to top the charts, I need that fine-grained control, and I need a different paradigm and a different set of workflows and tools for doing it. And so there may even be the same underlying model that powers both of those things,
Starting point is 01:47:29 and it's a different end-user application. Yeah. How do you think video creation is distinctly different than music creation? Because I think, you know, on our side, watching the Sora launch, Sora was like a cool creative tool that had this novel feature, the cameo feature, but then immediately those outputs just wanted to go live elsewhere, right? They wanted to go where these big audiences were. And I feel like with Suno the difference is you
Starting point is 01:48:02 can create music and you want to save stuff, and it actually can be very enjoyable to just listen to the content. Do you think that will always be kind of the dynamic, or do you think eventually video models will catch up to the point where people are generating videos and then just consuming them a lot? Because again, I think a lot of the output of something like Sora has been very ephemeral. It gets maybe shared in a group chat or shared on another social media platform, but people are not revisiting the content. Yeah, I think, look, very fast-moving space, and so, you know, this is just one guy's opinion today. I think there's basically two big differences.
Starting point is 01:48:48 One is the joy in making the music. I think people mostly enjoy consuming the Sora outputs, and the experience of making it is less interesting. And then the other thing about video in general, but especially AI video, is it really tends toward short form, and actually the beauty of music is that it really is meant to grab you for minutes at a time and not seconds at a time. And it makes for a completely different consumptive experience. And I think that's actually a lot of what's broken in music consumption today: there's so much pressure to put on a song, you know, through headphones, and then put your phone in your pocket and not even really pay attention to it. But I think a lot more is possible where I do have somebody's attention
Starting point is 01:49:35 for three minutes, four minutes, and you can actually tell the story through music in a way that, like, a 30-second video really does not. And so maybe AI video will get there. Maybe it will get there in a year. Maybe it'll get there in five years. But, like, today, that's what I see as the two big differences. Makes sense.
Starting point is 01:49:50 How do you think the music streaming platforms will navigate AI music? I think they, look, they already are. There's AI music on these platforms. Spotify's announced that they'll start doing AI music. And so I very strongly suspect that in like five years, this is not a conversation that's being had, because I already know that there's little bits of our music everywhere
Starting point is 01:50:17 in, you know, mega smashes. And the same way you don't think about, like, was this song digitally produced or was it produced on a giant console in a studio somewhere? And that's maybe something that people thought about for five years, you know, 30 years ago. Or, just like Auto-Tune, where now every song has Auto-Tune in it, even if you can't hear it. My very
Starting point is 01:50:39 strong suspicion is that that is where everything ends up. And the reason is it is better for consumers. It would really be bad for consumers, and therefore bad for the size of the ecosystem, if these things kind of bifurcate. Yep. That makes a lot of sense. It's like asking, I don't even think anyone ever asked the question, how will social media apps handle Photoshop? Photoshop outputs were just immediately everywhere. The tool existed prior to social media, so it just proliferated immediately. I mean, speaking of Photoshop, Photoshop has had to grapple with the existence of generative images through partnerships, through rolling their own. How important is it to you to play in the market
Starting point is 01:51:25 of, like, the enterprise audio workflows, do deals, partnerships? Because what we saw, at least in code: every coding platform, and there were a whole bunch of new ones, like there's been a real battle for the IDE, and I'm wondering if there's a battle coming for the
Starting point is 01:51:45 top end of the music production workflow? I think there probably will be a battle, but it will not be nearly the bloodbath that it will be in other domains. And it's just because the average piece of music today is produced
Starting point is 01:52:02 in 800 different tools. And everybody has all of the tools. And so I actually don't think it's entirely reasonable to think that there will be one tool that uses up all of your screen time and, like, that's going to be the thing that you produce 99% of your music in, if you are a true professional. And there, the end goal is the perfect song and a little bit less the journey of how you got there.
Starting point is 01:52:25 Yeah. Interesting. In code, it's like VS Code is just open all the time, and I don't want to have another editor. And if you talk to people who develop iOS, they're pissed off because they have two editors. But it's not nearly the same 800 different tools that it is in music. Yeah. Yeah, no, that makes sense.
Starting point is 01:52:42 This is super interesting. Thank you for coming on to break it down. Yeah, this is awesome. Thanks so much. I'm going to use Suno to try to make bedtime with my three-and-a-half-year-old a little bit easier later tonight. So I'll report back. I'm excited to hear. What are some of your favorite prompts that you use with your kid or kids?
Starting point is 01:53:06 You know, a lot of them are songs. I have a one-year-old and a four-year-old. It's like still too early for the one-year-old. A lot of them are songs about him in fantastical situations. Like, my most favorite one is just like last winter, after the Zamboni driver at the local ice rink let him ride in the Zamboni. And so it was like a whole album of songs about him riding in a Zamboni doing all sorts of crazy stuff. And yeah, they're like very special and personal, and I do go back to those. And, gosh, yeah, I think this is wonderful. Sometimes we joke if we were a kids app, we'd have five times the revenue that we do. You know, kids is like the most straightforward way to get me to part with my money. I'm sure you're the same way. Totally, totally agree. Yeah. So great to meet you, Mikey. Thank you for coming on. And congratulations. We'll talk to you soon. Cheers. Bye.
Starting point is 01:54:00 let me tell you about numeral.com sales tax on autopilot spend less than five minutes per month on sales tax compliance speaking of kids so nice to drop the hq yeah drop the hq rip to general hq speaking of kids i want your review of this way to decide how to buy a house so in new york city in 2023 this real estate agent worked with a couple who wanted to purchase an apartment on the upper west side of Manhattan, but they wanted their Pokemon-obsessed six-year-old son to have a say. How? By playing Pokemon in each neighborhood before his parents would commit to a purchase. We spent an afternoon covering four neighborhoods as the child played the game and caught Pokemon characters. It turns out that Central Park West was a hot spot for Pokemon Go characters that day,
Starting point is 01:54:59 and we toured a two-bedroom co-op that was listed for about $2 million. The child ended up catching a Snorlax as soon as we stepped inside Central Park. A Snorlax is a blue-green character that looks like a bear and is apparently extremely rare to find because it spends the majority of its life asleep.
Starting point is 01:55:16 The child is screaming, jumping up and down. He was so excited to find it. Although my clients looked at seven other apartments that day, no other area came close in terms of Pokemon action, so they knew that apartment was the one. They wrote up an offer that same day for the full asking price. What's your review? Is that a good way?
Starting point is 01:55:34 Is that cute and adorable to bring your kids in? I think it's cute. Or is it weird? It potentially could lead to total heartbreak, you know, if the kid's just like, yeah, this is the... It's actually a dry spot now. This is a Pokemon hot spot, and then the Pokemon dry up a week after moving in.
Starting point is 01:55:51 Good take. And they're like, what should we be doing here? I know we just moved in here, but we got to, I got to leave. We got to get closer to the Pokemon. Mom and dad, we got to book a Wander. We got to find our happy place, find our Pokemon hunting spot. Book a Wander with inspiring views, hotel-grade amenities, dreamy beds, top-tier cleaning and 24/7 concierge service. It's a vacation home, but better, folks. Yeah, I don't know. I think I like the idea of bringing the kid, the six-year-old, into the house purchase process. But
Starting point is 01:56:22 let's leave the phones at home and let's develop a taste for let's become snobs how about that six year old become a snob and say no i couldn't live anywhere but i don't like the cafe the upper east side because i have strong opinions about this get into the lore what happened on this street that's what the six year old should do i'm fine to delegate to the six year old but not to niantic the company behind pome go anyway our next guest is waiting for us in the restream waiting room we have have a mod from Mercury. How you doing? Good to see you. Welcome to the show. Great to see you. Hey, good to be back. How are you doing? Great. It's been a little bit. Yeah, it's been a while. A little bit. Give us an update on what's going on in your world. And then I want to just talk about all the crazy stuff that's happening in tech, honestly. Yeah, let's go for it. We just announced today on Fortune Termsheet that we had three years of profitability. Whoa. Whoa, whoa, whoa. I'm excited about. All right.
Starting point is 01:57:22 sound effects. Founder. That is wild. Full founder mode. Yes, it's been a one weird trick. VCs hate this. VCs hate this. One weird trick.
Starting point is 01:57:36 Area man discovers profitability. Congratulations. It's actually really funny. I love my VCs, but there's definitely, every board meeting, they're like, okay, how can we spend this money a little bit faster? I was actually talking to some CEOs just this week.
Starting point is 01:57:51 And, yeah, most of the time, like, spending more money doesn't mean you grow faster. And that's kind of, like, what I was trying to celebrate with this milestone. Like, you know, we're building software companies here. At least we are, but not everyone. Yeah, I got a pitch for you. You're in a strong financial position. You should put out a press release saying, we don't need a backstop. We don't need a government backstop. We are profitable. I didn't know there was a backstop available to me.
Starting point is 01:58:19 When those comments started floating Wednesday, I was ready to use the analogy of, like, what would happen if there was a government backstop on venture-backed corporate cards? I mean, truly, you are a bank, and you've obviously lived through the financial crisis. I can't say he's a bank. No, okay, okay. We're not a bank.
Starting point is 01:58:42 We're not a partner with banks. But you're obviously intimately familiar with the financial system. Do you have any unique lessons or stories that you tell from the financial crisis, warning signs that you pull from, even just like movies or books or something that you keep going back to is like, you know, we want to stay sharp on this. You know,
Starting point is 01:59:05 I think, like, building trust in what Mercury does has always been, it's actually probably the hardest thing we do, right? Because, you know, we're asking a lot from people. We're storing all their money for their business. And, you know, I had this conversation when we first launched Mercury with someone who'd raised $10 million. And he was like, this is more money than I've ever seen in my whole life. I'm not going to give it to a startup banking service. And now we have customers that have more than $100 million with Mercury. So, like, building that
Starting point is 01:59:30 trust is always so important. And, you know, obviously, I don't know which financial crisis you were referring to. That was 2008, actually. But I do want to talk about SVB as well. Like the SVB crisis. Yeah, of course. Take us through the SVB crisis. Yeah, what were the biggest lessons from the SVB crisis? Yeah, I mean, the biggest thing for us was it can't just be all talk, right? Like, yeah, I was talking to founders saying, hey, you should trust Mercury, and they'd be like, oh man, SVB is a 40-year-old company and they're, you know, dying.
Starting point is 02:00:08 Why should we trust Mercury? And then, you know, what we did and what we've continued to kind of invest in is to make it so it's not just about trusting Mercury, it's about having products that deliver, like, trust. So, you know, we have, like, accelerated FDIC insurance, which is, you know, I guess on the website we publish $5 million, but actually most accounts get way more than $5 million.
Starting point is 02:00:28 So almost all the deposits that sit at Mercury are FDIC insured. Yeah. And that was the big thing. I think SVB was at 80-ish percent uninsured deposits. And that's just because, just for anybody that's not super familiar, that's because they had standard $200, quarter million dollar of FDIC insurance, but almost every account had well beyond that because they were startups, especially VC-backed businesses, you know.
Starting point is 02:00:55 And also, in a zero-interest-rate environment, you're like, I'm just going to keep it in the checking account. I raised a $5 million seed. I'll just literally keep it in the checking account, because, like, what's the point? If I move it to some other vehicle, I'm going to earn 0.5% or 0.1%.
Starting point is 02:01:09 Now it's a different environment. Yeah, was it always the plan to get profitable and stay profitable as fast as reasonably possible? I think, like... Or did that happen because of SVB? If people are trusting you as, like, a financial platform institution, it's not exactly comforting to know that your financial institution is burning money.
Starting point is 02:01:28 Because I think the other thing that was part of SVB was, you know, people just had so little time to move all of their banking operations. And there's just so much reliance on a bank account for a business. It was such a funny moment. I was with SVB. It collapsed. And I was like, I need to go someplace less risky.
Starting point is 02:01:46 I'm going to First Republic. And then that collapsed too. I was like, but that feels even older. I don't know which one's actually older. I think First Republic was older. But it was this odd thing. Total, total refutation of the Lindy principle. It was like, actually, go to Mercury. You would have been better off. Sorry. I was going to say, I think one of the parts of that was, like, you realize so much of your operations depends on it. And it's even more extreme for Mercury customers, because, you know, we do more than banking. So we do
Starting point is 02:02:21 people's credit cards and invoicing, and we've been building more and more features, and our customers are reliant on a broader set of features. So, yeah, I think part of being profitable and sustaining profitability is, you know, making that promise to our customers that this is something that's going to stick around. So if they use us, it's not like, you know, one day we don't raise a funding round, or, you know, whatever, there's a crash or something, and people have to worry about that. Yeah, so it's definitely a cool part of having that trusted service. Yeah. Where are you guys getting the most leverage from AI?
Starting point is 02:02:50 Yeah, good question. The biggest thing we've done is, like, back office stuff. To run a fintech, especially kind of on the banking side, there's just a ton of people that have to do a lot of back office things. And, you know, you have to upload a formation document and someone has to review it. And all of these things are very ripe for AI to just, you know, smooth out, either, like, automate 100% or automate 80% so that a human can just review
Starting point is 02:03:19 the answers. So, yeah, lots of stuff on compliance, risk, back office. That's been our biggest investment. We've got lots of, you know... I guess a lot of what we do at Mercury is, like, not kind of follow fads or whatever. So we've done a bunch of things that are user-facing, but, you know, we try to add little bits of value here and there.
Starting point is 02:03:38 So, yeah, we have a chatbot, so you can ask your questions, and that's actually been... it's something like 40% of tickets. Customer questions are just like, you know, hey, how long does this take, why is it like this? Those kinds of questions. And that's just a better customer experience. And that's 40% over the baseline? Because most customer support systems, like, if you email in, it would previously just do, like, a fuzzy database search to be like, oh, it seems like you're asking about
Starting point is 02:04:09 this FAQ, you got a 40% lift on top of that, because you were already doing best practices. imagine? Something like that? Yes. It's a bit more complicated because we still have that email system when people email it in. So this did replace like the suggested topics and like the live messaging system. So yeah, I don't know exactly. What about on the cost side?
Starting point is 02:04:31 The suggested answers was way, way less because people don't like really know. Oh, yeah. Just things like they just want an answer and then we'll move on. What about on the cost side? We were talking to Ivan Notion. He said that his inference bill was actually. actually having an effect on his gross margins. You've remained profitable. But if I'm looking deeper in your financials, will I see an uptick in inference bills based on actually, is that
Starting point is 02:04:58 something that you're actually trying to manage as you stay profitable? Or has it been low enough that it's not? Yeah. And I think it's wildly different. I think using it in a back office context, where you're sort of processing forms and PDFs and things like that, versus Ivan, where people are like, I'm going to generate, you know, trillions of tokens, because I'm just making long documents. Totally. On the back office side, it's mostly a cost saving, right? Like, you're kind of giving leverage to humans and... Or you're moving off of, like, a Mechanical Turk or Scale or something like that for whatever workload it is.
Starting point is 02:05:32 And, yeah, on the user-facing side, it's, you know, I think a big difference for us is, you know, we have 200,000 plus customers, but it's not that many customers, right? like a lot of like consumer services of millions of customers. Like each of our customers is obviously like much more valuable. And also you probably don't have like a ton of DA. I mean, you probably have a lot of DAUs, but you don't, it's not the app that people open when they wake up. Yeah. You know, it's a great.
Starting point is 02:05:57 It's much more utility than productivity. People are actually kind of happy. You need to do something. That's why you do it. Exactly. And so and so the inference is just going to be so much lower than if Instagram is all of a sudden being like, we're going to run an LLM query on every, on every, on every, you know, photo that gets
Starting point is 02:06:13 uploaded, it's like, okay, well, that's a huge amount of inference. What? Yeah, and we're doing more and more, though, so, you know, potentially, you know, when you're churning through all these transactions, they'll go up. I don't expect for us. We also make way more on a per customer basis than, actually, I don't know about notion, but than like most prosumer consumer type things. So I think
Starting point is 02:06:31 it would net out that it would be a smallish percentage. Yeah, that makes awesome. What are you seeing across your angel portfolio? you've been you've been investing like i guess for context i've done about 350 investments in the last 10 years i don't know if i need a go for that and i just actually announced earlier this year my i have like a formal angel fund now it was a 26 million kind of angel fund i mean i'm still mostly investing in the same way
Starting point is 02:07:06 i mostly invest because it's just fun to talk to founders I guess the same reason you guys run this podcast. It's like interesting to talk to founders and learn new ideas. It's all, it's very AI focused right now, as you can imagine. You know, what's, I think AI is kind of fun to invest in because like every six months where you're investing in like changes a lot, right? Like for a while it was kind of dev tools in AI and now I feel like that's done. So most of what I see right now is kind of deeper, vertical, kind of, kind of
Starting point is 02:07:39 B-to-B applications of AI. Do you think AI gives people the permission to invest in more unique categories? Because basically, like, there used to be, like, a whole set of categories that were just kind of like small businesses or industrials or some niche, some weird niche thing. And they would just be kind of off limits because it would be like, oh, no, no, no, like the real money is made in fintech. The real bunny is made in maybe the ed tech, but that one's difficult. Or ad tech or marketing, MarTech.
Starting point is 02:08:15 You know, it's like there were these easy themes, but with AI, you can go and actually put together a pitch around something that's maybe more or less untouched by tech and actually generate a deck. It's definitely like, you know, this extension of the software is eating the world kind of vibe that Mark Andreessen talks about, right? So you need a little bit more of the world on the edges. Like 10-ish years ago, it was all like social and mobile, right? Like that's why everything was. And then we started doing, like even Mercury's, like 10 years ago, it was probably off limits to like try to make something that went after the banking market. And I remember when I started in 2017, it was like, you know, it sounded weird, right?
Starting point is 02:08:58 Like it was not something that was done. I think the thing that AI does is like open up labor, right? Like in robotics, it opens up physical labor and then in the case of kind of the white color workers, it opens up like human kind of intellectual labor. And that is like a whole set of categories. So, you know, people, generally VCs hate services firms, right?
Starting point is 02:09:23 Like, it's like, they'll be like, oh, there's like a services firm. But now, like, services firms are cool. You're like, oh, yeah, it's human. It's human as AI plus human services firms are like the thing that a lot of people are investing in ideally trying to get to like fully human only so fully AI only but yeah it's definitely like opened up a lot of industries that wouldn't have been interesting from a investment perspective
Starting point is 02:09:48 do you think people have been overly bearish on on SaaS it's you know notable that open AI uses everything from Salesforce to Slack to Qualtricks and I imagine you use a lot of on your business I don't think anybody's out there saying I'm going to vibe code a bank even if you could somehow build that on like a banking as a service platform it's like the level you know the amount of work you need to do
Starting point is 02:10:16 to maintain a vibe coded bank on top of you know it just doesn't make sense I mean I think there will be a wave of disruption that will happen in SaaS but I don't think the answer will be everyone's just making their own end to end
Starting point is 02:10:32 like sales force like that just that sounds ridiculous to me i do i mean i hope someone builds a much better sales force uh but it'll still yeah and maybe their pricing won't be like per seat and it'll be like per sale or whatever but i think the category will exist the shape of the category will change the business model will change but i don't think the answer is everyone's building their own sales force like that just that just sounds inefficient like it just sounds ridiculous to me but uh there will be like i think actually like the incremental new applications people build will actually make will just be more applications, right? Like if the number of vendors you had
Starting point is 02:11:04 was like 10 and like you're paying a lot, like maybe you'll have another 90 internal apps that like people just made for like small use cases. So like the amount of software people will use will go up, I think because of type coding and the ease of coding. But I don't think like these big categories are going to
Starting point is 02:11:21 just disappear because like people can now make it themselves. And do you think companies will go multi-product sooner and are you seeing that across the portfolio if the actual engineering hours or less to go from one product to a second to a third that will just encourage more companies I think it's going to be like industry specific
Starting point is 02:11:41 I mean I think a big story about fintech in the last couple of years has been like every fintech going multi-product right I think it's partly because when we started in 2019 like no one was building like a better neobank so that's all it took like our first product was a bank account
Starting point is 02:12:01 and it was like really good now to compete against Mercury you would have to have bill pay and credit cards and invoice them because like that's just what people look for like they look for more of a bundle product so I think it depends where you're competing
Starting point is 02:12:15 and to some extent it gets a little easier because you can just make products quicker but making like amazing products is never easy right like I think you just like the quality and craft and all of that do matter and we try to invest
Starting point is 02:12:32 a lot like when we make a new product it's never easy it's like it takes a lot to make an amazing product but when you're like very new to a category and like almost everything in AI is like very new you can build more quicker and then like kind of refine it over time but it's a good question
Starting point is 02:12:48 I think a lot of the ways people will end up competing with incumbents right like I've heard a ton of like I don't know why we're picking on Salesforce but I've heard a lot of like how do we beat Salesforce kind of strategies and a lot of them are like oh actually we're going to bundle like a i sDR and so you know x and y and like we're going to make it like a much more fully featured kind of version of like sales force
Starting point is 02:13:08 so i think like when you compete against income is like doing a bundle strategy sometimes is like a better strategy so like we'll see that play out yeah very cool well congratulations to the whole team congratulations a three-year milestone looking forward to the fourth the fifth six and hopefully hundreds of years of mercury profitability. You've got to get past the SVB mark become the longest-standing institution. It's funny. They were started like a year before I was born,
Starting point is 02:13:39 so I have this exact mental motive. Like they were started 42 years ago. That's amazing. Maybe I'll stop thinking about that when they're going to further down. Very cool. But yeah, really appreciate you having me on. Yeah, we'll talk to you soon.
Starting point is 02:13:53 Thanks to the whole team. Have a good one. Cheers. Bye. You heard about the importance of AI agents for customer service. You've got to head over to fin.AI, the number one AI agent for customer service, number one in performance benchmarks,
Starting point is 02:14:05 number one in competitive Bakeoff. Banking world champion. Ranking on G2. And he was also mentioning better CRMs. Why not try out Adio? Customer relationship magic. Adio is the AI native CRAM that builds, and grows your company to the next level.
Starting point is 02:14:20 Brett Adcock fired back at the hype around 1X, presumably. Fake your CEO, Brett Atcock says his, quote, major competitors are using teleoperation, human operators in their videos and calls it deceiving. Will he call out 1X by name? Will he call out Elon by name? Will he call out Unitary by name?
Starting point is 02:14:42 Let's find out by watching this clip. His analogy, it's like if a self-driving car pulled up and we found out there was some guy in Tennessee. A demonstration of a robot doing something that's like not teleopt or. coded. And what you're seeing now in the world is like, I would say most major competitor to us at this point are putting out most of their content and updates teleopt from a human. And I think it's like perhaps some of the most deceiving things I've ever seen. It's like if I was driving a self-driving car, if a self-driving car pulled up next to us, we found out there's some guy from
Starting point is 02:15:17 Tennessee driving it. We'd both be like, that's not what I thought this would look like. What you want So the funny thing is that, Tyler, why are you laughing at that? Well, that's like exactly how self-driving cars work right now. Teleop is actually maybe critical to the development of full self-driving cars and maybe used by all the top companies all the time. Yes, I mean, it is odd that like... I thought it was, I think it's like a key, it's the key enabler for 1X to actually get robots in homes and getting them to a point where they can do valuable work.
Starting point is 02:15:56 Yeah. Right? And they're going to be taking a loss on, they're going to be taking a loss on that. They're charging like $2 an hour. But that's going to be, you know, a flywheel for them to get more and more training data. Oh, the founder of OneX actually replied, I think, Dar. That's the founder, right? No, he's the Dars.
Starting point is 02:16:16 Oh, he's just the VP. Oh, but he's just the loudest voice on X. So I give him all the credit. He created one of the X. nothing without him. Anyway, he says, so basically Waymo, there were two operators to every car when Waymo started. It is interesting that, yeah, Waymo has never, that I know of, they've never really, like, published data on how many operators they have per car. I think that that would actually give people a lot of reassurance around, like, okay, these cars are driving around? Are there
Starting point is 02:16:47 people over watching them? And then you just get to a point where you say, okay, yeah, there's, there's 10,000 operators for 5,000 cars. And then next year, there's 8,000 operators for 5,000 cars. And eventually there's 5,000 operators for 5,000 cars. And then eventually there's 1,000 operators for 5,000 cars. And eventually there's one operator for 5,000 cars. But it's just a slow process of getting people up to speed on the new technology. And it just gives people confidence at every turn, much like the confidence that I feel
Starting point is 02:17:18 when I go to sleep on my 8th sleep. How did you sleep last night, Jory? 88 from me 7 hours 38 minutes I want to hear of that soundboard still has my favorite song on it boom you got an 88 I'll give it to you
Starting point is 02:17:32 wait do we tie now I got an 83 oh 83 okay rough one rough one well we have breaking news from the Pope he chimed in on the AI bubble some bub talk from
Starting point is 02:17:42 from the papacy a papal bull is that a top signal on the bub talk no no it's not it's not on the bubble it's not on the bubble it is on the bubble. It is from the Pope, though, who I should be following. Why am I not following the Pope? Technology innovation can be a form of participation in the divine act of creation. It carries
Starting point is 02:18:06 an ethical and spiritual weight for every design choice expresses a vision of humanity. The church therefore calls all builders of AI to cultivate moral discernment as a fundamental part of their work to develop systems that reflect justice, solidarity, and a genuine reverence for life. I like that. Good line. Liquidity thinks it's AI generated slop. Very bearish. No.
Starting point is 02:18:36 Yes, he used one M-Dash. Like, the M-Dash is not a tell-tel sign. You can just use one if you're just hammer it on the keyboard. Or you can just write a note and then in the edit, someone can use an M-Dash. I don't know. Is the Pope one-shotted? We'll never know. I think this is fine.
Starting point is 02:18:55 I think this is a nice little call to action. What do you think, Tyler? You put this in here, right? Yeah. I mean, what are you guys doing if the Pope comes out and says, like, I actually don't care about AI margins? What are you doing? I would be extremely bullish on that.
Starting point is 02:19:09 What I'm worried about is the Pope coming out and being like, actually, like, I've discovered some novel physics. I've been chatting, I've been talking to GROC and I've been talking to chat GPT for just like thousands of hours. People don't think about the recursiveness in the Bible. Yeah, no, yeah, recursiveness in the Bible. Actually, it's crazy. You know, I'm the Pope of the Catholic Church,
Starting point is 02:19:27 but I've been getting into polytheism recently. I went down a rabbit hole, and I think there might be more than one God. I think there's multiple gods. A.B. in the chat says, got to get the Pope on Vanta. Totally agree. 100%.
Starting point is 02:19:41 I think it would be interesting if the Pope started coming out and said, we're exploring, taking equity stakes in technology companies. Fannie Mae and Freddie Mac. Yeah. We'll see if we got to get the Pope on Bezell. Because your Bezile concierge is available now. A source of any watch on the planet.
Starting point is 02:19:58 Seriously, any watch. Yeah, the Pope already has the drop top six wheel G-wagon. Is it six wheels? I think so. No way. I always thought it was a rules. No, it is a G-wagon. But there's, I feel like he has a wild set of cars.
Starting point is 02:20:14 It's right. No, it's not six wheels. No, it's not six wheels would be so. But it is a one-of-one. made it. They made an EV. I love it. Speaking of EVs, the Ford F-150 Lightning is potentially going out of production. No way. It is close to failing. The Wall Street Journal has a story on the front page talking about, where is this? Pull that up. The Ford F-1-F-1-F4Ways ending electric F-150 truck. Executives at
Starting point is 02:20:49 Ford Motor are in active discussions, active talks, they're in talks, about scrapping the electric version of the F-150, people familiar with the matter said, which would make the unprofitable truck, the U.S.'s first major EV casualty. Maybe this is, do you think this is all triggered by T.J. Parker? Possible. World famous venture capitalist and founder himself. He bought a Ford, but he didn't buy an electric one. He bought a RR. Why they should make, why don't they make the electric one
Starting point is 02:21:25 like the Raptor R? Like with an internal combustion engine? Exactly. I feel like if they want people to buy the electric version of the truck, they should give it like an exhaust and an engine and gas. A real engine note.
Starting point is 02:21:39 Yeah, real engine note. When I have my Raptor, I... And it's more cylinders. For people, anytime I was on a call in the car, people would have... assume that I was driving a sports car because there's so much engine noise in the cabin. It was ridiculous. Yeah, I mean, the Raptor R has, how many cylinders does it have? Is it an V8? It is, right?
Starting point is 02:22:00 I thought it was a... So it has, it has eight cylinders. The F-150 lightning had zero cylinders. That could be why it's not selling very well. The Ford Raptor R had 5.2 liters in its engine. The F-150 lightning had a zero liter engine because it didn't have one. So it makes sense why this has been struggling. And we will see what they wind up doing. We got to change the subject because all this Raptor talks got me thinking about getting one again and pull the trigger. Well, if you think it's bullish or bearish that Ford is pulling the Raptor, no, they're going, they're doubling down on the Raptor. They're pulling the lightning. Get over to public.com investing for those that take it seriously, multi-asset investing, industry leading yields. They're
Starting point is 02:22:46 trusted by millions. Quadrillion. What else is going on in the markets? Hero, Nata, says, my inbox is full of people breathlessly trying to interpret this. And it is one-year look at Spy. It's over. It's light dip. You can interpret this in two seconds.
Starting point is 02:23:03 It's over. It's just completely over. The bubble popped. Now it's ready to start rebuilding. It was a good run. We can go up from here. Because the bubble has popped. You're welcome.
Starting point is 02:23:11 Yep. It was rough. We got through it. We are now. It's almost a weekend. We are now past the trial. off of disillusionment, we're in the plateau of productivity. That's good. That's what I see. That's good. That's good. And pretty soon, oh, the stocks have already stopped trading. Yeah.
Starting point is 02:23:25 So we're safe now. We're safe. You can't hurt us anymore. The economy can't hurt you on the weekend. There's, there's so many more charts. It's really chart talk time. Consumer sentiment, almost back to its record low. People are not happy with consumer sentiment right now. You see this to Bank of America? Yes. Just, as of yesterday, just fully recovered from the global financial crisis. Wow. Took 19 years. To the day and then it's like back in another crisis.
Starting point is 02:23:55 Can you imagine? He's like, no, we just dug her way out of the hole. No, fantastic, famously backstopped by none other than Warren Buffett. Yeah, it's been interesting to follow, I mean, Berkshire Hathaway is green this week. He's got so much cat. So much cat. building cash pile. So it's $1.07 trillion.
Starting point is 02:24:17 Maybe the least AGI-pilled person in the world. Cash-pilled. Imagine if, imagine if Berkshire Hathaway just comes in and just acquires like 90% of Open AI. He likes those authentic greenbacks. AG, what's the eye for? Authentic greenback incentives.
Starting point is 02:24:39 I don't know. Daniel Eath says, as commenting, so NVIDIA put out something on Wednesday. As I've long said, China is nanoseconds behind America and AI. It's vital that America wins by racing ahead and winning developers worldwide. Daniel says, this is an insane statement. America is more than nanoseconds ahead of China and AI. The actual gap is months, and it would be larger too if NVIDIA stopped the flow of advanced GPUs to China.
Starting point is 02:25:12 So putting him in the truth zone. There are some more information on the SNAP. Snapchat deal. Snap gets $400 million, which is greater than perplexity's total revenue. Snap gives nothing except access to... Wait, wait, wait, wait, whoa. It's more?
Starting point is 02:25:31 What? They're paying more than their revenue? How does that work? Perplexity is taking VC dollars, giving them to SNAP for distribution. There's also an equity. All the SNAP holders are...
Starting point is 02:25:49 Sorry, this is extremely confusing. Let's read this. The deal looks incredibly in SNAP's favor. Snap gets $400 million, greater than perplexity total revenue. Now, Signal is saying this is most likely $399 million of perplexity equity to SNAP,
Starting point is 02:26:06 not cash. So there is a question about how much of it is cash versus, so, yeah, the raise VC dollars give it to SNAP is one theory, but it is possible that they're just giving $400 million of equity, like a stock grant. I think it's a, it's some split. Yeah. I don't know if they published it.
Starting point is 02:26:26 But, like, signals thesis is that it's $399 million of equity and like $1 million of cash. And so it's basically just like, we want distribution in SNAP and we're willing to give you equity to pay for it because $400 million on $20 billion is 2% of the company, right? Something like that. Yeah. It's crazy. 2% of the company. Snap gives nothing except access to an unloved AI chat.
Starting point is 02:26:50 Perplexity gets indirect access to zero income teens. Sasha is not a fan of this deal. Spiegel negotiation master. Go back to the conversation we were having, I think, Saturday. We had a couple hour drive and we were basically doing a podcast with me. No mics, but I was saying I was surprised that Snapchat has not figured out a way to just monetize all the capital that a lot of these consumer AI companies have given their massive, massive user base. Well, it's because it's because the numbers are like all flipped around. Like Snap is is a, I think, like way more mature on all the business metrics and yet worth half perplexity. right like look at the look at the difference in terms of like business metrics d a u's uh page
Starting point is 02:27:40 flexity does not want will not want people comping it to snap and their next financing yeah it's odd i don't know it's it's it's some sort of like uh you know dance to actually uh integrate these two things uh taunay uh says snap will be integrating perplexity directly into snapch's chat interface, Perplexity will pay SNAP $400 million over a year in a mix of cash in stock as part of the deal and gets access to SNAP's 900 million monthly active users. AP in the chat, Snapchat's going to accidentally build AGI, just trying to make the dog filter blink realistically. I love it.
Starting point is 02:28:25 What do you think, Tyler? Is that a pathway to AGI? Well, I mean, look, Spiegel might be back. You think so? I mean, look, we know someone who previously had a big position in Snapchat. What do you think he actually gets? What do you think he actually gets? I have a lot of trouble understanding what Perplexity is, because it's somewhat of an application-layer company, but a lot of the APIs, a lot of the foundation model products, like, do deep research.
Starting point is 02:28:55 Some of them have news hooks already. So if my choice... like, Apple could have gone to Perplexity. They went to Gemini. Like, does Gemini give you search results the way Perplexity does? Like, Perplexity's initial pitch was, it's an LLM but there's no cutoff and we'll search the web for you. I feel like most of the LLMs just do that out of the box now. And so, like, what are you getting when you partner with Perplexity over something else? What's the differentiation? You're getting cash. That's why I think a lot of this deal had to have been cash, because I think, if I'm Evan Spiegel, I'm looking at, like, Perplexity's revenue and user base relative to his own. Yes. And you're not...
Starting point is 02:29:36 Okay. So help me understand this math. If I'm Apple... how many DAUs does Apple have? It's like the same size as Snapchat, right? In terms of, like, actual iPhone users, it's got to be a billion, right? There's like a billion iPhone users, something like that?
Starting point is 02:29:58 I think it's more. Like 1.5. Okay, so, and then Snapchat has like 400. Oh, wait, no, Snapchat is almost a billion MAUs, but DAUs is like 400, right? Yeah, like 470. Okay, so let's say that Snapchat is roughly 40% as many users as Apple. So it's 40% of Apple's, you know, billion-plus, 1.5 billion.
Starting point is 02:30:23 Apple is paying Google one billion. Snapchat could potentially go and pay Google for access to Gemini 40% as much, right? Because they're 40% the size. So they could pay $400 million and get the exact same quality of AI product that Apple is vending. You'd think, because Google goes to Apple and says, hey, you want to service a billion users? That'll be $1 billion. Oh, you want to service 400 million users?
Starting point is 02:30:56 That'll be $400 million, because it's basically a dollar a user. I don't know. Google's basically charging a dollar a user, but Perplexity is in the opposite scenario, where they're paying for the right, as opposed to the other way around. Yeah.
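The back-of-envelope above can be checked with the round numbers the hosts use. A minimal sketch, where all figures are the conversation's own estimates (the ~$1B Apple/Google figure, "a billion" iPhone users, "like 400" million Snap DAUs), not confirmed deal terms:

```python
# Rough per-user pricing check using the hosts' round numbers.
# These are estimates from the conversation, not confirmed deal terms.
apple_users = 1.0e9   # "a billion iPhone users" (the guest suggests maybe 1.5B)
google_deal = 1.0e9   # reported ~$1B/yr Apple-Google figure
snap_users = 400e6    # Snapchat DAUs, "like 400" million (actual cited as ~470M)

price_per_user = google_deal / apple_users     # ~$1 per user per year
snap_equivalent = snap_users * price_per_user  # ~$400M

print(f"${price_per_user:.2f}/user -> ${snap_equivalent / 1e6:.0f}M for Snap")
```

With the 1.5B iPhone figure instead, the implied rate drops to about $0.67 per user, and the Snap-scaled number to roughly $270M, so the "$400M, a dollar a user" framing only holds on the rounder estimates.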
Starting point is 02:31:10 The definition of perplexity is inability to deal with or understand something complicated or unaccountable. It's perplexing. It is perplexing. Similar to confusion, bewilderment, puzzlement.
Starting point is 02:31:21 Yeah. I mean, this was a little bit of my question with Mark Gurman, and he was like, no, no, no, and he kept shutting it down. And I think he has a strong opinion, and he might wind up being right, but my take was: when a platform, when an
Starting point is 02:31:35 aggregator, partners of the foundation model lab, which direction should the money flow? And he was like, obviously it should flow from the application layer to the foundation model app. So obviously Apple should pay Google because
Starting point is 02:31:50 Google is inventing a technology and then selling that technology to Apple. I think it, again, it would have looked a lot different. It would have been flipped if Apple was just, if Gemini was becoming the default assistant within Apple, which they would never probably allow, right? Well, they would if they're getting paid. Potentially if they were getting paid $20 billion. Exactly.
Starting point is 02:32:12 Which is like how I think this plays out, actually. Yeah, which is... I think somebody, maybe it's OpenAI, but I think somebody will be paying Apple for the right to funnel those LLM queries wherever they want and take the cut, just like what happens in search. But that's a hot take, and I respect people that don't agree with me.
Starting point is 02:32:30 In other news, Deutsche Bank is exploring ways to hedge its exposure to AI data centers. It's looking at options including shorting a basket of AI-related stocks and buying default protection via synthetic risk transfers. So they're hedging.
Starting point is 02:32:46 So they bet big on data center financing. The scale of the expenditure on AI infrastructure has prompted concerns that a bubble is forming, with some likening the enthusiasm to that which preceded the dot-com crash. Skeptics have pointed out that billions of dollars have been deployed in an untested industry with assets that quickly depreciate in value due to rapid change in technology. Stream Yimbi in the comments says, if there's one thing Deutsche Bank is really good at,
Starting point is 02:33:11 it's being very exposed. So in recent months, Deutsche Bank has provided debt financing to Swedish group EcoDataCenter, as well as the Canadian company 5C, which together raised more than $1 billion to fuel their expansion. The investment bank does not break down how much money it has lent to the sector, but it's estimated to be in the billions of dollars. Hedging exposure to the industry could prove difficult, because betting against a basket of AI-related stocks in a booming market will be expensive. Meanwhile, SRT transactions require a diversified pool of loans to earn a rating, and investors will likely demand higher premiums.
Starting point is 02:33:50 hyperscalist pursuit of superintelligence has fuel demands for infrastructure that will help them build it. I mean, this is everyone from the haunted mansion proprietor. Like, everyone is getting in on the action. Like a few years ago, like the, like, it was like if you have anything that's related to AI, like go, go, go, raise money, grow it, spin it, flip it, turn it around, pivot it, whatever you want to do. Europe is expecting a wave of deal-making consolidation and digital infrastructure, and the cost estimate from the financial times on the total AI buildout, $3 trillion, $3 trillion bucks, pretty wild times. Business Insider has a scoop here that Google is looking to invest in Anthropic at a $350 billion valuation. It's interesting. Anthropic seems, what, one quarter the revenue of Open AI, and yet from a vibe perspective,
Starting point is 02:34:52 from a public attention, from a PR crisis perspective, Anthropic used to be in trouble every other week, because Dario would be out there being like, paperclip, we're going to paperclip everyone, you know, we're going to get rid of everyone's job. And it was, like, really aggressive, right? And so they had to backstop their comms a couple of times. Recently it's been great. And Dario's looked great for the last six months. And he's just been beating the drum on, hey, there's geopolitical considerations here with China. But overall, we're building, and we're happy to be an API. We're happy to help businesses. Yeah, generate
Starting point is 02:35:32 code. Running the experiment of like highly focused lab versus highly, not distracted, but taking a lot of shots on goal with opening eye. could we could be looking back on this in a couple of years and being like, man, they really missed erotica, they really missed a slop. Consumer electronics. They missed consumer electronics. It's possible. But on the flip side, it does seem like they've built a, you know, a bit of an elegant
Starting point is 02:35:57 business, and it seems to align well with Google's strategy. And every hyperscaler wants a piece of it. Yeah. I mean, similar to how Microsoft has access to OpenAI models on Azure, you know, Google is bringing Anthropic in through an increased investment. Also, I just wonder, is it hard to do a deal at $350B these days in the venture community? Like, there was already that reporting that Dario had to go all over the world to scrape together the last round. And I'm wondering if there's a world where, if you're trying to raise $40 billion, you should just go to the hyperscalers.
Starting point is 02:36:32 Well, yeah, they're using every possible capital source. Yeah, we've seen some of these SPVs that have, like, you know, 0% carry, 10% management fee up front. So they're tapping everything. Let me tell you about adquick.com. Out-of-home advertising, easy and measurable. Say goodbye to the headaches of out-of-home advertising. Only AdQuick combines technology, out-of-home expertise, and data to enable efficient, seamless ad-buying across the globe. Jordi, where'd you want to go next? I think we should bring on our next guest, our first and only in-person guest. Let's do it. Jordan Castro. He's here live in the TBPN Ultradome. And we will bring... Muscle Man himself. He is the author of Muscle Man and The Novelist. Wait, you're... Was that your first novel? Did you write a novel called The Novelist? Yes. What inspired that? It's kind of like, uh, it's recursive. Yeah. Oh, well, I wanted to say first of all, thanks for having me. Oh, of course, of course. Um, you know, yeah. Um, and shouts out to Norman Plays
Starting point is 02:37:25 in the chat. Someone just showed me, they said, we're here for Jordan Castro. Oh, let's go. There we go. Shout out Norman. Um, but yeah, I write novels, but also, um, you know, you guys had the Pope up, talking about the Pope, and I work with, uh, this guy Luke Burgis on this thing called the Cluny Institute. Oh, no way. And one of the things... I've noticed we sort of do stuff at the intersection of, like, what we say, Athens, Jerusalem, and Silicon Valley.
Starting point is 02:37:49 And so sort of bringing religious wisdom into the tech conversation. But so I've been like, I'm not really related to tech, but I've been like at all these different like talks and so on about tech and whatever. And I noticed that when people normally talk about AI, they sound like AI. They immediately just sound like AI.
Starting point is 02:38:04 And when you guys were doing your AI quadrant thing, I was like, they sound like people. Interesting. And I was like, I'm like, why do I... Let's give it up for people. Yeah, yeah, let's give it up for people. That's right, that's right, yeah. And I was like, why do I like TBPN, you know? Because I shouldn't like... I mean, you know,
Starting point is 02:38:17 I shouldn't like it or whatever. And I was like, because you guys, like, talk like people, and you talk about people, you know what I'm saying? And so much of the conversation about tech... Yeah, it was very deliberate to put the faces of the people there, not the logos of the companies. Yeah, and I noticed that they weren't particularly flattering photos. Yeah, yeah. Oh, interesting. You mean we didn't make them into gigachads?
Starting point is 02:38:37 We would... a lot of times we'll give them muscles. Well, we will literally make them muscle men. We have fun with that. That decision was actually more just about, like, how do you actually fit them all in? Like, if you make them the muscle man, then you have to make it smaller and you can't see their facial features. No, I mean, some of those photos are, like, their hero photo that, like, their comms department clearly, like, put on the Wikipedia page or something. I don't know. Or, like, their corporate photo.
Starting point is 02:39:00 You don't always get to pick what photo becomes you on the internet, but sometimes... Trust me, I know. When people get mad at me on Twitter... I've been publishing since I was, like, sixteen. Oh, do they go find a stupid photo of you? They find pictures of me, like, in high school, wearing, like, a pink button-up, and they post it in their, you know, comments.
Starting point is 02:39:12 that makes sense. That makes sense, yeah. Yeah, I've, I've, I've, some silly photos from, from, of old that I'm sure will resurface when people want to dunk on me. Break this post down for me, actually. Like, what, what was your read on this?
Starting point is 02:39:24 So, technological innovation can, can, you think you used AI for it. Do you think you did? Well, I think the worst, actually, the, the worst thing about AI is that people are saying you can't use M-Dash's in the way. I love the M-Dash.
Starting point is 02:39:37 Okay, okay. So I don't love the M-Dash. It's not that I hate it. I'm just indifferent to it. I didn't, I don't even know where it is on the keyboard. And so like, you do minus sign,
Starting point is 02:39:46 minus sign, and then space, I think to generate one. I think you, I know, double tap, double tap space. Shift option dash.
Starting point is 02:39:54 So like, isn't it just double tap space? And I go here. You can double tap it, but sometimes it does show. You're like, this is the story. And then I want to do an M dash. Yeah.
Starting point is 02:40:02 And then space. Oh, putting on an absolute human M dash clinic. That's right. That's right. That's right. That's right. that's an underscore dude look underscore underscore maybe it's just something that I know how to do
Starting point is 02:40:13 test underscored test I just run it double double dash space so so so I completely agree with you so I wrote I write like a little you know 300 word 400 word like summary of my current thinking on whatever's in the news every day I just write it right in Google docs here I don't use any AI I hand it off to Brandon on our team he edits it I don't think he uses it in the AI, sometimes he would put M-Dash in, and if we put in even one M-Dash, all the commas,
Starting point is 02:40:44 we'd be, M-Dash, what was the problem? And it's just like, okay, like, now I just can't use the M-Dash because, like, the masses will go crazy. I refuse to let them take that from me. You're fighting in the good fight, okay. Yeah, yeah, yeah. You think you can win?
Starting point is 02:40:59 I think that it's just, I lost it. I was, and Brandon was like, really, like, and there were places, I'll show you. Sometimes I will literally just write, like, like, a minus sign. Not an M-Dash. I'll just be like, you know, close a quote. I will deliberately break the rules that I know I'm not supposed to.
Starting point is 02:41:15 Like, even when you put something in quotes, you put like the period inside the quotes or whatever, I will just break all those rules because it just makes it clearer that I'm not slopping it up. I don't know. I think you're accepting the frame of the haters. I think so.
Starting point is 02:41:29 I think it might be. I think you're right. I think you're right. Also, shouts out Brandon. I didn't hear any claps for Brandon. Yeah, let's get some claps going for Brandon. we call him in the he's got a new haircut he got a new haircut we need a we need a trading card for brandon's new he's a sleeper muscle man too he's a muscle man i know why i've got i got i got
Starting point is 02:41:46 is it fair to say i got you into it what you're you're the reason he's the reason you got is that fair to say or that he's the reason that you're absolutely shredded and diced he's the reason you're walking around at 2% body fat okay wow wow brandon content that's why that's why you can bench three plates you can just wrap it out because of him. Four. Okay, four. Wow. Whoa. Whoa. Whoa. Whoa. Whoa. Look. Crazy. It's good. Wait. So, yeah, actually tell me the story of the muscle man book. Like, what inspired it. What was the thesis? It's a novel about an English professor who hates being English professor and loves lifting weights. And part of it was that... And this is a real person or...
Starting point is 02:42:28 No, no, it's a fictional character. Okay. Yeah. Yeah. But like, you know, part of it was that I, when I was growing up I basically grew up online and you know it's sort of a classic story but it was like I was totally anxious you know like I would be at school then I would get on the screen and you know I would just post it sort of lurk and be sort of weird and creepy in front of the computer you know and then when I started lifting my anxiety and depression just like completely went away yeah you know and and I was like damn like you know was my dad right my whole life you know are all the sort of jacked guys who are like annoying and saying that you know you got to start lifting where they just all totally right.
Starting point is 02:43:06 The other thing, that tracks with my personal experience. I used to have trouble falling asleep at night. Yeah, that too. And the second I started lifting daily or six days a week, it just went away entirely. And the lesson is people that struggle to fall asleep, at least in some cases, are just not exerting themselves enough
Starting point is 02:43:26 to be just physically tired enough to, like, fall asleep. Started lifting soon enough, it's like, okay, you basically close your eyes, you're knocked out in like five minutes totally yeah yeah and so i wanted to just kind of like and then i tried to find novels or even books about lifting that i thought were like compelling at all and i couldn't find any so i started sort of trying to write um something like that it's also like a satire about higher education there's like a lot of rants and stuff about the universities and stuff that's cool um so um so yeah talk about the reception's been sort of idiotic okay um you know
Starting point is 02:44:00 The media, as we all probably... Let's give it up for an idiotic. Let's get up for the idiotic media. Yeah, yeah, yeah. You know, it's... The idiotic media, and we have a horn going on. We are the idiotic. Yeah, yeah, yeah.
Starting point is 02:44:16 You know, you know, you have a horse wearing a hat in the back. I consider... We're not journalists, but I definitely consider us to be part of the idiotic media. Yeah, well, that's... Now you're using it in a good way. You're reclaiming it, you know what I mean? But most of them are idiotic in a bad way. And the reception has been, like, here's a novel to, like, explain the man-of-the-manosphere, right?
Starting point is 02:44:36 And, like, here's, like, a novel about men. And I feel like, you know, back in the day, like, there's this great, the literary, the writer Norman Mailer has this debate with these feminists, like, back in the 70s. And it keeps calling them lady critics. This is a very hilarious debate. And I feel like in a weird way they're doing the same thing to me, but they're calling me, like, a man novelist. Okay. You know what I mean? Where it's like, this is a novel about a man.
Starting point is 02:44:57 Sure. So, like, it has to be about masculinity or something like that. Yeah, everyone wants to know what's going on with young men. Yeah. And it's this kind of fake discourse. Yeah, I saw Chris Williamson on Tucker the other day. Yeah, yeah, exactly. It felt very much.
Starting point is 02:45:09 I mean, honestly, like, I feel like Chris has done a fantastic job of actually sort of explaining, like, what's going on with young men. Yeah. Like, it is an interesting question. Yeah, yeah. There's a whole bunch of, and I feel like every media outlet has their own way of interrogating this, whether it's Harper's or Vanity Fair or wired. Like, everyone's telling stories and writing profiles about interesting people in young men. and I don't know, it seems like it was somewhat worthwhile discussion to have if it can inform interventions, even just in your own life or with your own kids. It seems like there's some
Starting point is 02:45:40 value to the discourse, but did you have any intention of actually sparking that debate? Or like actually like was one of your thesis, like I need to put this book out so that more people lift weights at young ages? No. But I did write an essay for Harper. That was just basically a straightforward pro lifting essay. And so that had more of that. Sure, sure, sure. That's awesome. The thing that my takeaway from going, I was very anti-Gim growing up because I was a skateboarder and a surfer.
Starting point is 02:46:11 Yeah, me too. Right, exactly. A skater, surfer, like the idea of like going inside, being indoors, lifting stuff up and putting it down just for the, whatever the reason was. It wasn't appealing to me as a kid because I was like, well, I could just go to the beach and surf and, like, be out in nature. and so I had a generally negative sentiment around the gym and I was also like super scrawny growing up I've got like long arms I'm I was like in high school probably like 140 pounds like getting under like one plate on bench was like I'm gonna die at first and I never really got into it I wasn't I didn't play football or anything like that
Starting point is 02:46:50 and then I got really into it in college and it was the biggest unlock for like mental health yeah like i prior to prior to that i was like a you know teenager i was like teenagers i think are just naturally kind of like moody and figuring out who they are and how they fit into the world and all that stuff and then i found lifting and like a lot of that stuff just like went away entirely because i had a purpose uh i would even if that purpose was like so simple and just like going i woke up i was going to eat well and i was going to go to the gym i was going to lift as heavy as i could and go on with my day. And it just, it ended up being like,
Starting point is 02:47:25 it was like a drug almost immediately because I was like, I want the feeling of how I feel after I lift. It was incredibly addictive in a very positive way. So I did that. And then about probably a year or so, I went from like 100, by that point of like 150 pounds to like 190 in like a very short period of time.
Starting point is 02:47:46 Like basically spending, I was, this was from freshman year to sophomore year. and I was spending so much of my energy just focused on lifting. And I ultimately reached a point where I was like, okay, at some point this is actually not the most productive use of my time. And I was fortunate enough at that point to kind of discover work and things that I was passionate about and ways that I could use my energy that was more interesting to me.
Starting point is 02:48:13 And so I think like the debate to have around lifting is like I think like basically everyone should be like, picking up heavy stuff and putting it down, like at least a little bit, right? You don't need to go to the extreme. The advice that I find myself maybe giving like, you know, people that were in my shoes is like figure out the point where you maybe actually want to dial it back a little bit because it's possible and probably healthy to go way too much to the extreme where you're spending like, like I was, like all my time and energy, like just lifting because I didn't have anything better to do besides like, you know, work and school. And then at some point I actually
Starting point is 02:48:50 need to dial it back and actually apply that energy in other places. But I got the benefit of like that period of just like obsessiveness that I still carry with me. Yeah, 100%. I mean, that was sort of actually what the novel is exploring. You know, because when I first found it, like you said, it's like, is this the answer? Is this the ultimate answer? You know what I mean? And it really is. I mean, that's a part of the issue that I take with like the optimizers, you know, like Huberman. I mean, I know you guys had Huberman on or Brian Johnson, where you sort of become like perversely obsessed with health or with lifting or something like that. You sort of tinker. You know, you think of the person, the human person as some sort of like, you know, mix of neurons and chemicals
Starting point is 02:49:27 and stuff. Yeah, it's interesting. Like, even between Brian Johnson and Heurman, I see a huge wide gap. But let's put them in the same bucket. Yeah. How do you contrast your philosophy? I think you need to be injury maxing. Okay. You need to be horsing heavy weights. And I think it's like a spiritual endeavor. I think you need to almost, I think you need to get crushed under body breaking weight a few times. And yeah, it's more of a spiritual process than an analytical process. Yeah, 100%.
Starting point is 02:50:03 It's Eric Bouganagan, who's like my favorite lifter on YouTube, always talks about how it's a mindset. And he has these things where like he'll just fail insane amounts of weight multiple times and then just like work himself into this kind of like crazy mental state and then lift it, you know? What do you think about the other YouTubers? What do you think about Sam Sulek?
Starting point is 02:50:23 Oh, I love Sam Sulek. What do you think about the Tren Twins? I love the trend twins. I love, yeah, yeah, yeah, yeah, for sure, for sure. Sam Sulek, also Sam Sulek is weirdly emo. Yeah. Like he will, you know, his vlogs. Actually, there's this weird thing where,
Starting point is 02:50:35 so my, I don't know what I'm talking about this now, but my brother died like a year and a half ago. And after the funeral, one of my buddies threw on Sam Sulek. video and I don't and it's inspirational yeah I was incredibly inspirational but I noticed I was like Sam Sulek's videos are sort of email where he's like driving by himself yeah he's like talking to the camera he's like taking unnecessary loops around the parking lot just nobody it's why he's so popular yeah I I've maybe watched like 20 minutes of Sam Sulek I appreciate like the content
Starting point is 02:51:03 yeah like that the the world that he's built but because specifically I remember being in that mindset of like I was kind of like again I remember I was like 19 I was in school I wasn't that happy. I was, like, living in Santa Barbara, which was, like, paradise. But, like, mentally, I was chasing that, like, I need to, I'd be, like, those scenes where you're, like, driving, I'd be driving to the gym at, like, 5 a.m. Yeah, I can relate with everything you're saying, like, down to the specific. Like, I started lifting, too, like, sort of, like, I was in school, just moved.
Starting point is 02:51:33 It didn't really have friends. Yeah. Would, like, go to the gym at 11 o'clock, you know. Yeah, yeah, weird hours. These early, early mornings or late, and then getting, like, food afterwards. and like honestly that era for me like future uh the the rap artist was like absolutely peaking what year was this monster for me yeah yeah that era was just out yeah yeah so so i we might we might be the same guy i became a man what's your wait is your full name jordan yeah
Starting point is 02:51:58 wow this is brother make it make it make it make sense what's going on here but uh but yeah so i would i would drive i i i just remember like at the time i was driving a uh uh uh uh uh uh uh uh uh a Prius with like 200,000 miles on it. And I'd just be like, driving to the gym, just blasting future, drinking protein shakes. And that's like when I think I became a man. Yeah. Like, incredible. Incredible things. And so I want, I want, I think it's super important. And like, when I hear, when I hear that a young person is getting into going to the gym, I'm like, great. Totally. Like, you're going to become, you're going to find yourself through that. And my only thing is
Starting point is 02:52:39 like I think at some point you have to evolve beyond that and take the learnings and that like that mentality that you get from going to the gym and apply it to other other ways. I think like the big takeaway that I learned early on was like early on I was obsessed with like how many how many sets am I doing? How many reps am I doing? How am I, you know, just approaching different lifts? How many different exercises should I be doing in a lift? And I very quickly realized that there was like two fact there was maybe three factors it was like was i sleeping a lot yeah was i eating a lot and was i like hyper consistent and doing a lot of volume and those are the same lessons that i've brought into my real life outside of the gym it's like am i sleeping a lot
Starting point is 02:53:23 am i like nourishing myself am i getting the right amount of calories am i getting you know getting the right amount of um uh just like the right kinds of food and then am i being consistent in the work And so we apply that every day. We come in and do the show. And you also go to the gym. I mean, I don't know. Behind the scenes look, the whole squad was at the gym this morning, which I think is a great thing that you guys do.
Starting point is 02:53:46 Yeah. A lot of people that were on that chart, a lot of the tech elite are deeply unpopular people. I think Elon was polling as less popular than Donald Trump. So not just controversial, but actually just unpopular across the board. A lot of tech CEOs, a lot of tech people are just unpopular in America. obviously we love them here but we're weirdos about that but my question is like when I think
Starting point is 02:54:11 about who's really popular I think Tom Cruise I think George Clooney I think people that are in shape is there something you know we we do this jokey like gigacadification of these people um is there something where you like like it would actually be in their best interest for the tech elite to get a lot look at Bezos I was going to say like that um you know I I used to think Zuck was a full creep He was like a beady-eyed sort of bug. He would like, when Zuck would look in the camera, you could almost like hear him blinking. You know what I'm saying?
Starting point is 02:54:41 It was, and then he got jacked and threw on a chain. I was like, this guy's not so bad after all, you know what I mean? So maybe they would all benefit from them. PR teams hate this one simple trick. Well, PR teams should love it. No, they should love it. But they maybe aren't awake to it for some reason. Yeah, look, if you're on a billionaire's PR team,
Starting point is 02:55:01 step one. Step one. Hire a trainer. Bring the trend twins in. Yeah, yeah, bring the trend twins in, get jacked, and, you know, then the public opinion. Why do you think health is so political? Oh, man, I did an event two nights ago here, and someone asked the same question, and this guy in the audience stood up, because I actually don't know.
Starting point is 02:55:27 I mean, one reason is that I think that when you're jacked and you're, you're strong and you're, you know, or you're competent, it acts as a judge, actually. Because if you're like, you know, fat and you're lazy, then you feel just sort of implicitly judged. The other thing is that going to the gym and getting jacked implies a certain value system. You know, there's like, you know, if you lift 145, you know, that's less than 225. Sure. And if you're in the gym, you're trying to get stronger, the implications that it's better to lift 225, you know? And so, like, a lot of people who are, like, dogmatically committed to a certain kind of, like, egalitarianism don't like that.
Starting point is 02:56:03 But the guy at the event stood up, this guy, he looked just like Joe Rogan. His name's Anthony, actually, shouts out Anthony. But he was like, he said that, he said that generally speaking, the left tries to impose their ideas onto nature. And the right starts with nature and discerns their ideas from there. And so there's something just sort of fundamentally different about that. And I think lifting sort of falls into the latter camp. I have a question. Yeah, I'm always surprised at how much, like, there seems to be more controversies within,
Starting point is 02:56:30 in between like various people even that consider themselves all to be pursuing their ideals of health it's like everybody like kind of picks it lane in a category and then it's like they're fighting over like green powders what green powder should you take oh my green powder is better than your green powder it is universal like it i feel like it's it's hard to make a political issue out of something that doesn't affect everyone because you just keep bumping into people and if your whole thing is like you know corporate cards or something or like you know the the manufacturing supply chain for large gongs it's like there's a lot of those right yeah yeah even have the you got you I noticed in the bathroom the picture of the jacked guy the shirtless guy hitting the gong I think
Starting point is 02:57:13 these guys yeah yeah yeah it's like it's hard to get everyone animated about that it's a niche interest whereas like health care how much money you have your job your health your food like these things are universal so anyone can relate to them and I can give you a take and you can bounce off it immediately instead of being like I don't really know anything about that or that doesn't really affect me ever yeah but there's also there is so much more people die in this this actually I actually don't know if this is true but you see this talking point all of them more people die in this country from like obesity and from starvation you know there is this total cope around around being unhealthy and people people don't like the idea that there's actually something
Starting point is 02:57:50 you can do about yeah yeah yeah it's just like everyone everyone can be obese or not, like, not everyone can have a, like, a large corporation that goes bankrupt under their stewardship. Like, so it's just less relevant to them because, like, they might not be the CEO of a company that's going bankrupt or something. Pickleball. There's, uh, I couldn't find it in here, but there's a, in the mansion section online. There is a house that is up for sale right now that has a pickleball court outside and a
Starting point is 02:58:19 pickleball court inside. What's your take on pickleball? dude it's fun I hate to say it but it's fun I mean I played it one time with a friend and I was like man this is I probably said some words and then we played it
Starting point is 02:58:36 and I was like this is awesome so I don't know you played inside or outside outside I mean two courts is a little bit you know I don't know if you love something Zoom out for me build the ultimate mansion for athletics
Starting point is 02:58:51 Are you going all in on one particular athletic endeavor? Are you making sure that there's a pool and a basketball court, a tennis court? What's important to have around the mansion? For me, definitely a basketball court. A basketball court. And then like a gym in a garage with no AC. No AC. And what's in that gym?
Starting point is 02:59:11 Are we doing rogue racks? Are we doing? Rogue racks are good. Okay. Yeah, yeah. Free weights? What is it about what's the archetype that can make a home gym? really work. Do you have to be, like, spiritual? Like, every time I've just been in a debt,
Starting point is 02:59:26 like, during COVID, I bought a bunch of weights and I was working out of my garage. And I just got so small. Hard to be motivated. Yeah, I got, that's a skill issue, right? I'm sorry. No, no, I think I'm implying it's a skill issue because I would be going in there. And I was at home so I could get an email and I'd be like, okay, I'm going to run in and respond to this email on my computer because it's easier. And if I was taking a more monk-like approach, I'd probably have been getting even more jacked because there's no distractions, right? It's just like more focused. Well, during COVID, I mean, we ordered a bunch of weights, me and my lifting buddy, like right when, like right at the tail end, because everything sold out like
Starting point is 03:00:04 super fast. Yeah, but you still had a buddy. That helps. Had a buddy, yeah. And, um, and we, we ended up getting these like dinky, probably like 25 pound barbells. Okay. These like shitty weights or whatever. Um, but I would just go and I had a shed at the, at the house. And I would like go in the shed and throw on music. And it was actually the first time that I was lifting where I understood the power of like shrieking and screaming while you lift. There's a lot of power. Yeah, yeah, yeah. You can, yeah, yeah, yeah. The modern man does not shriek and scream. What's in the supplement stack? What's Lindy? What's interesting? What do you stand away from? Are you beta alanine? Are you use creatine, protein supplementation, mass gainers? What do you like? I'm feeling reactionary against the
Starting point is 03:00:48 optimizers. say nothing. I mean, creatine, for sure. Protein's fine. But I think you could basically just eat. Raw milk. I do like raw milk. I drink a lot of raw milk. Recently I gained, I went from like 165 to 200 pounds over the course of like three months. And so I was drinking like a gallon of raw milk a day, a tub of Greek yogurt a day. Yeah. The raw eggs, I do think like drinking, you know, 12 raw eggs makes you feel amazing. But that's kind of the, I went through raw egg. I went through a phase where it would be like five eggs mixed into raw milk, just raw. Were you bodybuilding.com guy back in the day?
Starting point is 03:01:27 That was a long, that was a long. I was like, I wasn't even like a phase. I would just, that was just me for a long time. I realized like, wait, I don't have to scramble the eggs. I can just put them in a jar and just drink it. I would, I didn't like the taste. I would kind of almost throw up sometimes, but it was just so effective. I mean, Derek from our place, more day, it talks about just drinking the full carton of liquid egg whites. He does. Yeah, yeah, yeah.
Starting point is 03:01:52 Just chugging the carton. Yeah. That was a good era. There was a period where I was watching his videos, like, when I was first getting lifting, watching, like, I remember being like, I can't believe I'm watching an hour and a half videos about some. It's crazy, but it's so meditated. Well, it's like what we were saying at breakfast. It's just amazing to watch anyone who loves what they do. Master of craft, yeah, exactly. Makes sense.
Starting point is 03:02:09 You would have appreciated we gave a talk last year when we had done like a few episodes of the podcast. And then we gave a talk at Hereticon of, like, the case for, like, founders getting on, like, PEDs. Because you have, like, when you think about, like, pro athletes are, like, spending all their time trying to be healthy, sleeping well, working out and being as strong and as physically fit as possible. And they legally or cannot take PEDs. Although, if you ask them, like, if you could take PEDs, if it was legal, would you do them? They'd be like, yeah, absolutely. I want to go to the next level. meanwhile in the business world you have people that are not sleeping that well they can't train
Starting point is 03:02:48 that much they're not that healthy they have like kind of not that great hormone profiles in a perfect world they just start sleeping well and training and eating well and getting on track but for a lot of people for your CEO busy CEO you're scaling your company you don't have like you some of these people like can't prioritize it or they don't for whatever reason and like I genuinely think if they just got on PEDs they might build they might build the build the you know of course, under doctor's supervision, but they might actually, like, get motivated in order to build some of those healthier habits,
Starting point is 03:03:20 but at the very least have a lot more energy and bring that intensity. What's your take on PEDs? I don't do them because I haven't had kids yet, but they do seem probably good. I don't know. Have you done them? Actually, although I, actually, one of my buddies from Cleveland was having a hard time on,
Starting point is 03:03:39 I won't say his name, he was having a hard time on the dating apps, and he was like, no one's responding, you know, whatever, like, and he started blasting gear because he thought it would help him get girls. And I was convinced, yes, immediately. Whoa. And I was like, but he didn't even get jacked in the meat. It was like, it was literally immediate.
Starting point is 03:03:55 And I was like, was it a mindset thing? I think it might have been a mindset. Sure, sure, sure. Yeah, yeah. He's like, I got, I have, I've solved the problem. Yeah, it was all my problem. So I can bring the confidence. It worked for him like the next day.
Starting point is 03:04:05 I find, I find dating. Dating. Dating. Last year. What's that? I find creatine genuinely very. Yeah, yeah, I take reason. I also find caffeine extremely performance-dancing in the gym.
Starting point is 03:04:17 Yeah, if you can, like the Celsius phenomenon, I think is very real, like 200 milligrams of caffeine, getting actually very fired up before a workout, genuinely helps you stay focused during the lift, push you a lot farther. Yeah, we want to make the beta alanine for the workplace, you know, imagine you got to do some emails. Is that the tingling? Yeah, you start tingling. They're like, I got to get through these emails. I mean, I will say that the, yeah, for sure.
Starting point is 03:04:40 I guess the supplement stack is like. like drinking enough caffeine to, like, kill a small child. It's really good. Not that I'm going to kill a small child, but that would if they drink it, kill them. These mattses are really good there. Are you guys sponsored by these guys? We're not, but we're just friends. That's Andrew Huberman's brand.
Starting point is 03:04:55 Fuck. God damn, sorry, man. Hey, look, look, look, I love your product. If you, you know, um, talking trash, but I don't know, you're a good. They're good. I don't consider Andrew's philosophy, like, about over-offination. Nothing you said is wildly disagreeing. Yeah, but yeah, I, I, I, I,
Starting point is 03:05:12 I think it's more about, like, understanding, understanding yourself and the various input so that you can then make calculated decisions. Like, I still, I know how important sleep is. There's plenty of nights. We were traveling on Tuesday, didn't get a lot of sleep, made a call to try to get some more sleep, but ultimately I think it's just about, my view is like,
Starting point is 03:05:35 as much as possible, try to understand how the body works, but at the same time, you need to remember that the the real edge is like having like a fiery like spirit like and just being like that that is like
Starting point is 03:05:53 that's the thing that you want to cultivate right yeah there's that yeah I mean I yeah I maybe part of the my feeling about it is that the one time I tweeted something negative about Andrew Heuberbin I got like ratioed into oblivion you know people calling me gay and it gets like 5,000 like You know, and I'm just, so, so, yeah, they came after me hard, but you're probably, you're probably, you're probably right, you know, I don't know.
Starting point is 03:06:17 But then, but the, the counter argument to that is like that. Makes a delicious Yerba Mata. Well, yeah, he makes a delicious, uh, Matina Yerba Mata. He makes a great, great product. And actually, in my first novel, there's a long passage about, uh, how Yerba Mante is great. Oh, really? Yeah, yeah, yeah, in the novelist. But, um, I guess the counterpoint to the, to the sleep thing that you're saying is that, um, there's that for a while when, like, the Tate brothers were just
Starting point is 03:06:41 all over the field. Is that one of Tristan being like, you know, I'll go to sleep at 6 a.m. and wake up at 5.59 a.m. and, you know, one minute before 6 a.m. And I don't need any sleep. And like that, that's the kind of counter argument. That's what you want to do?
Starting point is 03:06:54 I mean, it's just, it's a, it's an admirable vibe. It's the attitude. Yeah, yeah, yeah, yeah, yeah. I always get it, I always, I always do beef with Brian Johnson about this because I'm just like, don't you think that if you just have like great well and a life's purpose, you can live forever even if you're super unhealthy.
Starting point is 03:07:13 And I always cite like Charlie Munger and Warren Buffett who like eat McDonald's and drink Coca-Cola all day long and like live to 99 and 100 and just like, you know, they don't look jacked. They don't really do any biohacking. But they seem to be extremely sharp. One thing we probably agree on is like I don't think you should let health trackers like decide your mood.
Starting point is 03:07:33 Totally. Or decide like how you're going to approach it. Oh, totally, totally. It's like I wake up some days. I sleep on an eight sleep. I have like a 50. No, the tracker is a. It's still going to the gym.
Starting point is 03:07:42 It's not telling me what to do. Well, and it's just, yeah, it's just a, it's a little friendly. It's not tracking me. I'm tracking it. You're tracking it. I'm tracking it. But then there are also studies that are like, you know, the people that live the longest have like, you know, they have like good relationships.
Starting point is 03:07:55 Yeah. They have some sort of spiritual life. You know, it's like those. Yeah, that stuff is very unquantified right now in the quantified self movement. And it needs to be more included, I think. Yeah. That's why I don't think you can really like quantify that stuff. People are really uncomfortable things you can't quantify.
Starting point is 03:08:08 Totally, totally, totally. But I think probably, yeah, but I think probably, yeah, good relationship, some sort of spiritual life, general health, you know, eating the things that like you're, because also different people, different, you know, diets work better for different people's and stuff like that. Totally. Yeah. And what's your actual, uh, routine? This is, what's that? What's that? Are you a bro split? Push pull legs. What's you doing? The manosphere split. Just crushing fentany. Um, um, um, um, um, yeah. Not beating the manisphere allegation.
Starting point is 03:08:36 Yeah, yeah, exactly. Lately I was doing a mad cow, I think it was called. It was just three days a week, and it was literally just... Go crazy. It was just powerlifting. So it was like squat, bench, deadlift, barbell rose, three days a week, five by five, three days a week. So you do all of those every day? No, no, no.
Starting point is 03:08:56 All of those three days or you rotate through them? You rotate through them, but you're swatting three days a week. Okay, wow. Benching twice a week. You're doing overhead press two. Mad cow. You're adding five pounds a week. So it was like, and that was when I was doing the spirit bowl.
Starting point is 03:09:10 And five by fives. Yeah, exactly. I like the five by five. That's good. That's a good. So I was doing that recently. I do like, I mean, the thing that got, I'll, I'll, I, I, the thing that maybe we're just holding back a little bit of our progress is like we want to work out every day.
Starting point is 03:09:27 And there's actually like studies that show if you just take more rest days, you can put on, You can just like start. But the problem with that program is like I also like lifting every day. Yeah. So it was like a problem. Like I was like it was actually not the most enjoyable thing. I do like as much as I'm like opposed to it in theory. I do like Jeff Nippert's programs.
Starting point is 03:09:46 Oh yeah. Power building program. I like him a lot. Yeah, they're very good. Yeah. So the power building one and power building three. Yeah. I think my favorite program's up down.
Starting point is 03:09:55 Yeah. I just like the way he like structures his like lessons and stuff. Well, he's also natural, which is like a big deal. You know what I mean? A lot of people who make these. programs aren't natural so it won't work for guys that aren't sure sure sure yeah makes sense yeah they're like do 10,000 reps of this the rich piano eight-hour armwork I'm on I'm on enough trend to kill a horse yeah yeah yeah yeah that's fantastic
Starting point is 03:10:19 well thanks so much for chatting what's your where you have your next book in the works uh yeah yeah yeah yeah it's gonna be a thriller can you share the prompt what's the prompt No, no, I can't, because the, the, the, the, the, the, the, the, the, the, the, the, the, the, don't make mistakes. No, no, I'm not, I'm not, I'm not, I'm not going to do it. But thanks for having me out. Make sure that. Mussel man is the current one. You guys can buy it.
Starting point is 03:10:38 Uh, what about the biggest fish you've ever caught? Dude, I just, I never. You've never been fishing? No. No. I mean, my family has, no, dude, I was too punk. I was too much like he was a ski. You've never shot a fish in a barrel?
Starting point is 03:10:50 Nothing. Nope. Nothing. Okay. Well, we'll have to change that soon. Uh, yeah. The outfit looks like it could maybe transform onto a fishing boat and fit in. It's funny.
Starting point is 03:11:00 That's funny because I wore this the other day for an event. And my buddy was, I told my friend that I was coming on this show. And he was like, you're actually kind of dressed like he thought the vest was somehow TBPN. Also, Brandon Content over there has the same vest. He told me about, he told me that he has it. He's looking sharp. Well, thank you so much for coming on the show. Leave us five stars on Apple Podcasts on Spotify.
Starting point is 03:11:21 And if you're listening to this and you haven't lifted today, hit the gym go for getting a lift visit the church of iron that's right have a great weekend everyone we love you goodbye see you Monday