TBPN Live - Intel Rips, Thrive Launches Eternal, GPT 5.5 | Diet TBPN

Episode Date: April 25, 2026

Diet TBPN delivers the best of today's TBPN episode in 30 minutes. TBPN is a live tech talk show hosted by John Coogan and Jordi Hays, streaming weekdays 11–2 PT on X and YouTube, with each episode posted to podcast platforms right after. Described by The New York Times as "Silicon Valley's newest obsession," the show has recently featured Mark Zuckerberg, Sam Altman, Mark Cuban, and Satya Nadella.

Follow TBPN:
https://TBPN.com
https://x.com/tbpn
https://open.spotify.com/show/2L6WMqY3GUPCGBD0dX6p00?si=674252d53acf4231
https://podcasts.apple.com/us/podcast/technology-brothers/id1772360235
https://www.youtube.com/@TBPNLive

Transcript
Starting point is 00:00:01 The big news, of course, is Intel. Intel is absolutely ripping. Intel jumped 20% after hours on the back of $13.6 billion in Q1 revenue, only 11% above analyst estimates, but there are five key factors coming together to create a new narrative around Intel that are driving the stock much higher, and it is at almost an all-time high. Bubble Boy on X and a lot of other people have called this, and it is long overdue. Revenue is only up 7% year over year, but next quarter is already guided to be better, somewhere north of $14 billion probably. But there are still big losses under the hood. This quarter, they lost $3.7 billion, not good, but that was mostly driven by one-off charges related
Starting point is 00:00:50 to Mobileye and derivative payments tied to the U.S. government's 10% stake. So if you strip those out, Intel actually earned $1.5 billion, which is much better than what people were expecting, which was break-even. So the cruise ship of Intel is starting to turn somewhat, but the narrative has already completely shifted. So Intel is working again is the idea. The AI trade has mostly been Nvidia, Nvidia's memory suppliers, TSMC, power equipment, cloud capex, and a few software names that can prove real adoption. If you're accelerating your top line, you are an AI winner. That's sort of the rule of thumb. Of course, things might play out differently, but that's what's been happening in the market. Intel was the most embarrassing missing piece. Why isn't Intel
Starting point is 00:01:34 booming when we're in a computing boom? It made no sense. But there were good reasons, and we'll go through them. The company that invented the modern CPU era, they missed mobile, they fell behind TSMC, they failed to produce a competitive AI GPU. They did fab a GPU at one point. They tried to get into gaming at one point, but they never really found traction, especially in the data center and the servers for AI training. And so they have fallen behind. And then they spent the last few years trying to convince the world that Intel could still matter. But now, oddly enough, the rise of AI agents, it's giving Intel a second shot. Do you have thoughts on Intel yet?
Starting point is 00:02:10 I'm going to keep going. It's funny because at the moment that the U.S. took a position in Intel, it felt like a bailout, right? It felt like that, yeah. And you would expect, okay, over time, this is going to be very, very good for Intel to have that vote of confidence, but I don't think anyone was really predicting that it would go up quite this much in the span of just half a year. I think Ben Thompson wrote a pretty strong bull case for the Intel-U.S. deal. Basically, you had to grapple with this idea of, like, this doesn't feel like free market capitalism. Yeah, I guess the only thing is, like,
Starting point is 00:02:51 there wasn't this narrative around CPUs at the time that the deal was happening, or at least it wasn't a very public narrative. Yes. Yes, the bull case at that time that I heard most loudly was not the CPU boom. It was just overall chips. It was 14A, the leading-edge fab: basically, by having the U.S. government as a shareholder, you see those dinners with all the AI lab leads and all the hyperscaler CEOs. And there's a world where there's another one of those meetings. And the government administration says, hey, we are backing Intel.
Starting point is 00:03:24 I need all of you to commit to buy, you know, a ton of supply if Intel actually goes and builds the leading edge fab, which is going to cost them billions of dollars. But if they have the demand guarantees, then they will actually be able to go do it. Tyler, what was your take? Yeah, I think Leopold, or sorry, Intel has long been kind of the Leopold take, which is that, like, you know, this is the clear, like, this is the company you want to own in the kind of nationalization world of, like, okay, Taiwan risk. Oh, sure.
Starting point is 00:03:54 What are we going to do if we can't make chips, you know, in Taiwan? Intel's like the very clear strategy. Yeah, maybe it's like not working right now, but like at some point this is not just like economics. This is like, okay, this is, you know, national security. Like, it's extremely important that we have this capacity in the U.S. And Intel is kind of the only one that's even like remotely in the conversation. Yeah, yeah.
Starting point is 00:04:14 No, that makes sense. There were some predictions from AI 2027 and other folks in the AI forecasting world around the rise of agents, that agents would arrive at this time. I didn't see too many of the folks who were predicting strong, valuable, effective AI agents really predicting the CPU crunch, but that's exactly what's happened. So AI agents need CPUs to do things. Training frontier models, that's still a GPU story, but running agentic workflows across data centers, orchestrating tasks, routing jobs, managing memory, handling inference workloads, coordinating servers, increases demand for the boring old central processor. Intel's data center segment produced
Starting point is 00:04:56 $5.1 billion in quarterly revenue, beating the $4.5 billion that analysts expected. And Intel's CEO, Lip-Bu Tan, said the next wave of AI is moving from foundational models to inference to agentic AI, and that shift increases the need for Intel CPUs, wafers, and advanced packaging. On the earnings call, he said that the CPU-to-GPU ratio is closer to one CPU for every four GPUs versus one for every eight in prior years. And there is an interesting forecast from Lip-Bu Tan as well. But VK Macro says the CPU-to-GPU ratio flipping from one-to-eight to eight-to-one is absolutely wild. That's just a completely new world compared to what we've had so far.
Starting point is 00:05:39 This is from Evercore ISI's Mark Lipacis, who has upgraded Intel directly from neutral to outperform and mentions that as AI workloads shift further toward inference and agents, the weight of CPU demand will rise sharply, and the CPU-to-GPU ratio could flip from 1-to-8 to 8-to-1, which is a massive, massive switch. Yeah, I mean, I don't know, like, that's kind of a crazy ratio. It is. Because, like, the lesson of the recent frontier models is that, like,
Starting point is 00:06:10 the models are going to keep getting bigger. You're going to need way more, like, chips to inference them. Like, Dylan Patel was on Patrick O'Shaughnessy. I think it came out yesterday or the day before, and he's just saying, like, you know, the models are going to get really expensive. Tokens are going to get super, super expensive, people are going to pay up.
Starting point is 00:06:25 So I don't know if I fully believe the narrative that, like, you need all the CPUs. Yeah. You're going to need way more CPUs than... Well, I mean, the... It really lines out perfectly. His camera is the main one that fits perfectly. The glasses land.
Starting point is 00:06:42 I love a Friday show. Yes, that's a good point. But there is this... I think at least in the midterm, like the... Yeah, there's like... Capabilities overhang. And just, like, one really smart AI system that's being inferenced on GPU
Starting point is 00:06:55 can spin off a ton of CPU workloads and do a lot of things that require CPUs. Yeah, like, even in the SaaSpocalypse, like you vibe-coded Slack or whatever, it's like, well, that vibe-coded system is running on a CPU, and if it took you, you know, like, I don't know, a thousand GPU hours,
Starting point is 00:07:14 but then that system runs on CPUs that are running constantly for every user, you still wind up creating more CPU demand out of the GPU. I think this take makes a lot of sense if you are, like, kind of, you know, long open source. We're going to have a lot of these open source models. Yeah. They're quite small. You can run them on, like, a small number of chips, but you just have a lot of them.
Starting point is 00:07:34 You have a bunch of agents doing a bunch of stuff at the same time, rather than a single model, you know, inferenced on a massive number of chips. Why is the horse here? Anyway, whether it's going to eight CPUs for one GPU or staying at one CPU to four GPUs, it's still twice as good as it used to be, because it used to be one to eight. And so that is bullish for Intel, and Lip-Bu Tan is making that clear to the market and to the investors on his earnings call. Then there's the wild card: TeraFab, the Elon Musk project. It's very exciting, but it's clearly further off.
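For a rough sense of what those ratios imply, here is a quick back-of-the-envelope sketch. Only the three ratios come from the discussion above; the fleet size is a made-up number chosen to make the comparison concrete.

```python
# Rough sketch: CPUs implied by a fixed GPU fleet under the ratios
# discussed on the show. The 100,000-GPU fleet is hypothetical,
# used only as illustration.
gpus = 100_000

ratios = {
    "old (1 CPU : 8 GPUs)": 1 / 8,
    "current (1 CPU : 4 GPUs)": 1 / 4,
    "forecast (8 CPUs : 1 GPU)": 8.0,
}

for label, cpus_per_gpu in ratios.items():
    # CPUs needed alongside the fleet at each ratio
    print(f"{label}: {int(gpus * cpus_per_gpu):,} CPUs")
```

Going from 1:8 to 1:4 doubles CPU demand per GPU; going all the way to 8:1 would multiply it 64x, which is why the forecast sounds so wild.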
Starting point is 00:08:09 Elon Musk wants to build a massively vertically integrated chip manufacturing operation, with Tesla, SpaceX, and possibly other Musk companies needing huge volumes of chips for self-driving cars, humanoid robots, and even space-based AI data centers. Intel is supposed to help design, manufacture, and package chips for the project. Now, the Wall Street Journal has a more cautious view today. Elon is aiming for TeraFab to reach 100,000 wafers a month, and then eventually 1 million wafers a month, which is just insane scale. So let's put it in context. One million wafers a month is about 70% of TSMC's total monthly output across all fabs. TSMC's largest fabs put out roughly 100,000 wafers a month into production.
Starting point is 00:08:52 So you're talking about 10, you know, leading-class TSMC fabs at Intel just on the TeraFab project, plus whatever else Intel is doing. So a pretty massive scale. And TSMC's CEO, C.C. Wei, has said that it's just not that easy to build a fab overnight and get it up and running. Fabs take two to three years to build, and then another one to two years to actually ramp. And we've seen this with TSMC Arizona, which we've been very excited about, but it just takes time. And there is a bottleneck. And the bottleneck has been discussed at length. Ben Thompson wrote about TSMC risk in Stratechery, arguing that TSMC needs to spend more
Starting point is 00:09:33 on capex. And I think that should be clearer now. We'll see what happens at the next TSMC earnings call, because maybe they will be waking up. But overall, Intel now has a collection of plausible demand stories, even though the demand itself is not going vertical. The stock is going vertical on the back of these five key demand stories that are all pointing in the same direction, directly up. One, AI agents need more CPUs. Two, AI systems need more advanced packaging and a higher ratio of CPUs to GPUs, and Intel can help with that. Three, the U.S. government wants a domestic leading-edge foundry; this is just a national mandate. Four, Musk wants an impossible amount of silicon. And five, the hyperscalers want more supply.
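The TeraFab scale comparison from a moment ago checks out with quick arithmetic. The inputs below are just the round numbers quoted on the show, not precise industry data.

```python
# Sanity check on the TeraFab numbers quoted on the show
# (rough, as-spoken figures, not audited industry data).
terafab_target = 1_000_000   # wafers/month, Musk's eventual goal
largest_fab = 100_000        # wafers/month, a top-tier fab's rough output

# "1M wafers/month is about 70% of TSMC's total output" implies:
implied_tsmc_total = terafab_target / 0.70

print(f"Implied TSMC total: ~{implied_tsmc_total:,.0f} wafers/month")
print(f"TeraFab target = ~{terafab_target / largest_fab:.0f} top-tier fabs' worth of output")
```

So the "about ten leading-class fabs" framing follows directly from the two round numbers.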
Starting point is 00:10:18 And so all of that is good news for Intel. There are more companies that are getting into the CPU design space. We talked about Arm recently. And Arm, of course, it feels like they will be going with TSMC in the short term, but who knows what happens in the long term. So as CPUs continue to be important in the AI story, all good news for Intel. So suddenly investors are more willing to entertain a messy, expensive, strategic chip story than they were five years ago. Intel famously missed mobile, which meant TSMC ran away with enormous manufacturing volume and left Intel with a demand problem. Volume was destiny, as they say. You can't grow if you can't build the fab. That is, on the leading edge, you can't build the fab unless you actually have the customers.
Starting point is 00:11:04 And so if you missed mobile, you just have this gap and you have to jump over it with the help of the government, a bunch of other people and the AI narrative and all these different five key pieces. So the pieces of the puzzle are coming together now. And that's good for Intel. It's also good for America's chip manufacturing prospects. So good news overall. And congrats to all the Intel shareholders that were believers early on and rode the wave. Everyone. Every U.S. citizen. Every U.S. citizen. Congrats to all Americans. Every U.S. citizen.
Starting point is 00:11:31 And every U.S. debt holder. So even the international folks that own treasuries are in a better financial position today. You've got to think about what this does to Trump's confidence level. You know, he's like, I enjoy a bailout. He's like, I'll indulge Spirit Airlines, American, Spirit. I think he's excited about the potential there. Sure, sure. But why stop at just bailouts,
Starting point is 00:11:58 right? Why not, why not start taking a position in hyperscalers? Like, look, like, I think I can move your market cap by 3x. Yeah. I've done it before. Yeah. And Intel was falling apart. Yeah. Imagine if he applied all his wisdom to a company already winning. Certainly possible.
Starting point is 00:12:16 Jim Cramer's excited about it. He says in 13 months, Lip-Bu Tan took Intel from a possible and unthinkable bailout candidate to one of the wealthiest companies in the chip industry. There is a big three of CPU: AMD, Intel, and Arm, and the agents need far more CPUs than these three can produce. So that means prices are going up, and they got the God candle. Yeah, look at this candle. What a candle. We love it. We love a green candle. Should have been white suits today. Well, before we move on, there's some news. Justin Bieber
Starting point is 00:12:47 brought Biebercella back from the dead. Later in the show, we'll tell you what this says about late-stage venture reacceleration. We're going to tie those two together. In other news, for more than a year, Silicon Valley has buzzed about Cursor's growth and whispered about its margins. Now, on the cusp of a $60 billion buyout, Laura over at The Information... Wait, they called it a $60 billion bailout? Buyout. I'm sorry, buyout. I was like, what? ...has revealed the hot vibe coding company's financials. A $60 billion acquisition, and they're calling it a bailout. It's not a bailout. Their margins were rough. We were just talking about it.
Starting point is 00:13:28 Cursor had negative 23% gross margins earlier this year. Amir says that is low for a company generating as much revenue as it is. This is the question of how sticky Cursor will be as an entry point to AI, as an entry point to inference demand. Can they reroute? It feels like, with Composer, they have been able to retain a lot of customers. The company is still growing, and we know Cursor users who have stayed with the product while changing inference based on what their plan gives them. So they're subscribers, and they will use whatever the plan... whatever gets the job done for them within the budget of the plan.
Starting point is 00:14:10 And so obviously, like, you see negative 23% gross margins, and you're like, whoa. And then you look at the SpaceX deal and the xAI deal and all the compute that's sitting at Colossus 2, and you think, oh, well, like, what will the gross margins be once the inference is happening on a Cursor-trained model or an xAI-trained model? Are there going to be higher gross margins? It's pretty easy to imagine that the gross margins would increase. Did you want to play this piece from Patrick O'Shaughnessy?
Starting point is 00:14:42 Let's do it. So clearly Anthropic is in the lead, right? And OpenAI is cooked. What's interesting is, because Anthropic has such bounds on compute, and they can only grow it so fast, and, sort of to the point, Dario used to gloat about how OpenAI was being too aggressive on compute
Starting point is 00:14:58 and Anthropic was more sensible in their scaling, and now Anthropic is like, fuck, I wish we had a lot more compute. OpenAI is able to pay the bills perfectly fine. In fact, they've raised a ton of money to get incremental compute in addition to the irresponsible levels of compute that they were buying from Oracle and CoreWeave and SoftBank and all these people, and Microsoft,
Starting point is 00:15:17 and now they're getting Trainium as well from Amazon. But what's interesting is, if you were to say Opus 4.6, you know, let's ignore models getting better over time. Let's just take diffusion of this technology. You and I may jump on the model immediately, day one, but other businesses take time, and it takes time for people to learn. And so by the end of the year, let's say a 4.6-Opus-tier model, the economy would spend $100 billion on. I don't think that's unreasonable. It's spending $40 billion right now. Anthropic won't have enough compute to do that. And presumably OpenAI and Google will hit that tier soon enough. Whoever hits that tier next... sure, Anthropic may get to charge 70-plus percent
Starting point is 00:15:56 gross margins, but if OpenAI hits it next, they charge 50 percent gross margins. They still get all of this incremental demand, and probably they also won't have enough compute to serve all the users. Sure, maybe my thesis is a world where, if we had enough compute, it'd be $500 billion of revenue or something crazy. There is such demand for these tokens and such limitations on compute. You know, we see this with H100 prices skyrocketing, and the useful life of these GPUs continues to extend. It's pretty clear even the Tier 2 lab is going to be sold out of tokens, let alone the Tier 1 lab.
Starting point is 00:16:26 The Tier 1 lab will have better margins, but the Tier 2 lab will be sold out, and probably the Tier 3 lab will also be close to sold out. The economic value that the best model can deliver is growing faster than our ability to actually serve those tokens to people via the infrastructure. And so this gap will continue to grow, and the model labs will continue to have expanding margins, until people in the hardware supply chain,
Starting point is 00:16:45 the infrastructure supply chain, say, wait, why don't I just jack up my margins? Oh, yeah. I love that sound. I love that sound. Bill Gurley. Let's see what Bill Gurley says. Let's check in with BG. He says, I find this conversation foreign, along with the argument that we are
Starting point is 00:16:57 data center constrained or energy constrained. Historically, in markets, price is a leveler of supply and demand. If you have a constraint, you price higher. You don't have surplus demand. Yeah, it's a good point that the big labs are losing money. But it doesn't feel like the revenues of OpenAI and Anthropic are heavily VC-subsidized. They're not VC-subsidized. Yeah, that's what I'm saying. And their customers are not startups, necessarily. It's, like, hedge funds and banks and stuff. So, uh,
Starting point is 00:17:23 hedge funds and banks that are printing by betting on semis right now. Yeah, which is different. Like, I'm totally on board with the idea that VC dollars are subsidizing demand, right? But the idea that VC dollars are the biggest drop in the bucket of subsidized demand feels like, I don't know, it just doesn't quite math out to me. The actual revenues and demand from Fortune 500 companies are so high. And the really big dollars are in all of these rounds. I mean, Anthropic just raised something like 40 billion from Google today.
Starting point is 00:18:09 And that's not a VC subsidy. I mean, it is a subsidy. Yeah, I'm surprised he's not talking about, like, big tech subsidies. Yeah, yeah. Maybe he is using the term VC dollars, like, broadly, which is, like, fair. And if he's using the term VC subsidies, VC dollars, broadly, I think it is a reasonable point. But at the end of the day, you talk to most companies and most, you know, paid chat subscribers, and they're just like, I like the thinking model, so I pay $20 a month. I like the pro model, and I use it to create value.
Starting point is 00:18:45 I'm getting that much value in, like, software for my business that probably doesn't even exist. Like all the vibe-coded software that we use here, it's just not available off the shelf. We're building completely net-new products. And I think that that's generally what's happening in the AI economy. There's been this discussion for a long time. Martin Scully was sort of debunking it, around, like, oh, are these all, like, circular deals? And he was like, no, the economy is
Starting point is 00:19:11 so broad that you have... yes, Google might take a position in Anthropic, Microsoft might have a position in OpenAI, but you also have completely Main Street customers and just average Joes who are paying, seeing ads, buying things. There are companies that are profiting off of running ads, and other inference providers. Everyone's... it's one big circle. Yes, it's one giant circle of life that is, you know, actually self-perpetuating, because it is a true economy that hits, you know, 25 different categories. Well, you might think it's over, but Elad Gill does not.
Starting point is 00:19:53 He says: my view is the AI boom will only accelerate and is a once-in-a-lifetime transformation. This is orthogonal to whether many AI companies should exit in the next 12 to 18 months, as some may lack durability versus labs, new entrants, or weird market shifts. But he's extremely bullish. He also posted a funny thing about his new life plan. He said his new life plan is to move to Brooklyn, get a neck tat, ride a bike everywhere, cold brew his own coffee, also start drinking coffee. That's an odd thing to jump straight to, of course. Was he trying to... is this like a cipher of some sort? He wants to adopt a 12-year-old straight. Is there a secret message embedded in this post?
Starting point is 00:20:37 Maybe. It sounds like he's advising his portfolio companies. Yes. I don't care how hard you're ripping. A lot of you should probably exit. Yeah. Just given how much uncertainty there is. Well, later in the show, we have a fun story.
Starting point is 00:20:53 A bear wandered into a backyard and took a dip in the family pool. We'll take you through it. Coming up later in the show: what this tells us about edge computing. Get ready for it. Has Jane Street achieved AGI internally? I think so. They definitely had...
Starting point is 00:21:11 They did $40 billion in revenue last year, more than all the big Wall Street banks, and with only 3,500 employees. Yes, let's give it up for Jane Street. Fantastic. We have some exciting news out of Thrive Capital. Josh Kushner announced a new fund, Thrive Eternal, a permanent capital holding company
Starting point is 00:21:30 that will be concentrated in a small number of assets that we can own and steward over many decades, across Thrive Capital and Thrive Holdings. We are building and investing through a moment of exponential change, backing emerging technologies, the infrastructure that powers them, and the businesses they can transform. Increasingly, he sees a fourth category: assets with qualities that cannot be replicated by technology, iconic franchises and cultural institutions rooted in tradition.
Starting point is 00:21:55 And he says his first partnership is expected to be with the San Francisco Giants, an institution built on more than a century of shared identity and community, and among the most iconic sports franchises in America. I looked it up before the show started, the San Francisco Giants, if you're not familiar. It's an American professional baseball team based in San Francisco. Baseball is a bat-and-ball sport played between two teams of nine players, each taking turns batting and fielding. The game occurs over the course of several plays, with each play beginning when a player on the fielding team, called the pitcher, throws a
Starting point is 00:22:32 ball at a player on the batting team, called the batter, and then the batter tries to hit it with a bat. And so they play this game, they sell tickets to the game, and that's how the San Francisco Giants make money, and Josh Kushner with Ryan Turner is getting in on the action. Yes. For whatever you just described, I feel like it would be very cool. That would be cool. If you had maybe a couple hosts, you know, on-screen graphics, and they could provide kind of
Starting point is 00:23:02 live coverage, maybe. Wow. Extremely educational. This is a crazy story. Brandon Guerrille and our team were obsessed with it. A U.S. Special Forces soldier involved in the capture of Venezuelan President Nicolas Maduro was arrested for allegedly betting on that operation, netting him $400,000 in profits. This is when the betting-on-yourself meme goes wrong, right? You should always be betting on yourself, but not literally if it's against the law. If you are involved in a government capacity, you should probably not be...
Starting point is 00:23:33 gambling on the outcome of your government work. Shame on you, but still an interesting story. Even Trump was like, he was betting on himself, and then he referenced a baseball player that isn't in the Hall of Fame because he was betting on himself, but Trump was seemingly implying that that wasn't so bad. People are really, really having fun with the new image gen model. Oh, this was so rude. But this guy's effectively playing Minecraft through ChatGPT, because he has it generate him an image, and then he says, okay, walk closer, and it just generates a new scene. I'm sure there's some kids out there that have figured out how to play Minecraft in ChatGPT.
Starting point is 00:24:17 But this is very funny. I found this post when it had 40 likes this morning. Sniped. Sniped. Invested early. Cohere is merging with a German AI lab called Aleph Alpha. I'm the biggest fan of Cohere. I love Aidan Gomez. Transformer paper alum, Death Grips enjoyer, what's not to like about the legend Aidan Gomez? We've got to get him on the show. But this is a
Starting point is 00:24:44 congratulations moment for Cohere and Aleph Alpha. GPT-5.5. Yes. Kitts is saying it might be AGI, because if you ask it, bro, the car wash is five minutes away, should I walk to it or go by car, it says: car, bro, it's a car wash. Let the plot make sense. Wait, what? Like, it just immediately clocks it. Oh, I like that. I've seen this exact test done in normal language, like with a proper prompt, not the casual bro slang.
Starting point is 00:25:17 And it actually feels more like AGI if it is able to pick up on the lingo and mirror that back: car, bro, it's a car wash. That's very funny. That feels very remarkable. It also got the Rs in strawberry correct, right? And: how many strawberries in the letter R? Zero. The letter R is not known for its berry storage. These are good answers.
Starting point is 00:25:41 Being a helpful assistant while simultaneously rejecting nonsense has been particularly difficult for LLMs, so good progress here. We'll figure out more about what went into this and what the next generation of AI impacts are from this. Paj says he found a post from five and a half years ago. Elon says the rate of improvement from the original GPT to GPT-3 is impressive. If this rate of improvement continues,
Starting point is 00:26:09 GPT-5 or 6 could be indistinguishable from the smartest humans. Just my opinion, not an endorsement. I left OpenAI two to three years ago. I am a neutral outsider at this point. Greg says, thank you. A lot has changed, but he was accurate in his prediction. Funniest outcome is the most likely. And then the last thing I wanted...
Starting point is 00:26:31 They wind up becoming best bros and do a buddy cop film together. The last thing: there is a new app from X. Oh, yeah. X Chat. X Chat. And I'm excited to try this out, because the current... Yeah, the DMs on X have been pretty brutal. Yeah.
Starting point is 00:26:47 Right? I haven't had that much of a problem. I do have this weird bug where, when I click the chat button on desktop, it sort of loads and then it re-sorts after it loads. And sometimes, if I haven't been in it in a long time, it needs to reshuffle several times. And so that can be a little jarring. But overall, it seems like they made the migration
Starting point is 00:27:10 to encryption. I haven't... I don't know, I seem to get messages. It seems to work fine. It's cool that there's a new app. I probably need to be better about answering DMs. I have a lot of them that are unread, so having a dedicated app for that sounds cool.
Starting point is 00:27:25 People were taking shots because the everything app is supposed to be everything in one place, and instead it's three apps to get everything in the everything app. Sort of silly. Who cares? The more apps the better. I like apps. So go download it. Go check it out, and go like Nikita's post, because he's been on a generational run.
Starting point is 00:27:42 Also, GPT-5.5 is available in the API now. That's breaking news, I guess, from Craig. So anyway, we will see you on Monday. Have a great weekend. Enjoy the weekend. Leave us five stars on Apple Podcasts and Spotify. Sign up for our newsletter at tbpn.com
Starting point is 00:27:57 and we'll see you tomorrow. Flashbang. Have an incredible weekend.
