TBPN Live - Will AWS Buy Google’s TPUs, Remembering Claude The Gator, Ricursive Raises $35M | Diet TBPN
Episode Date: December 4, 2025Our favorite moments from today's show, in under 30 minutes. TBPN.com is made possible by: Ramp - https://ramp.comFigma - https://figma.comVanta - https://vanta.comLinear - https://linear.a...ppEight Sleep - https://eightsleep.com/tbpnWander - https://wander.com/tbpnPublic - https://public.comAdQuick - https://adquick.comBezel - https://getbezel.com Numeral - https://www.numeralhq.comPolymarket - https://polymarket.comAttio - https://attio.com/tbpnFin - https://fin.ai/tbpnGraphite - https://graphite.devRestream - https://restream.ioProfound - https://tryprofound.comJulius AI - https://julius.aiturbopuffer - https://turbopuffer.comfal - https://fal.aiPrivy - https://privy.ioCognition - https://cognition.aiGemini - https://gemini.google.comFollow TBPN: https://TBPN.comhttps://x.com/tbpnhttps://open.spotify.com/show/2L6WMqY3GUPCGBD0dX6p00?si=674252d53acf4231https://podcasts.apple.com/us/podcast/technology-brothers/id1772360235https://www.youtube.com/@TBPNLive
Transcript
Today I wrote about whether AWS will buy TPUs from Google on the front page of the Wall Street Journal's business and finance section.
They're singing the Trainium chips' praises.
Amazon's chips pose risk to NVIDIA.
The whole week we've been talking to people.
Is that clickbait?
I don't know.
What we're going to find out.
We'll see.
It certainly doesn't seem, you know, good to have more competition in the market.
There's a lot of losers if Google winds up winning with TPU.
The losers came out to fight, apparently.
Amazon.com is the latest big tech company
to muscle in on NVIDIA's turf.
Give me a sound cue from the fall sound.
Muscle in.
How about this?
There we go.
That's right.
On Tuesday, Amazon Web Services announced the public launch
of its Trainium 3 custom AI chip,
which it says is four times as fast
as its previous generation of artificial intelligence chips.
4X speed up. That's actually very significant.
That's great.
The company said Trainium 3,
produced by AWS's Annapurna Labs, fascinating company,
acquired a decade ago for around 350 billion,
or 350 million.
So it's a pretty small acquisition, actually,
350 million.
In AI, you never know.
But back then, you start a custom silicon company.
You could barely clear nine figures on the way out the door.
Annapurna Labs has been working on custom silicon
for Amazon for a long time.
They actually do have a custom CPU at AWS
to accelerate CPU-based workloads.
Then for the last few years, they've been working on GPUs or, you know, ASICs for accelerated workloads.
And so this custom chip design business, Annapurna Labs, can reduce the cost of training and operating AI models by up to 50% compared with systems that use equivalent GPUs.
The chips are meant to provide a stronger backbone of computing power for software developers like Dean Leitersdorf, the co-founder
and chief executive officer of the startup Decart, who we had on the show. And
Decart is valued now at $3.1 billion. Let's go. So if you don't remember,
Decart came on, and Dean was doing live AI video generation while he was doing the interview
with us. It was really crazy. Yeah, it was real time. He looked like he was
in a video game, but it was happening with little to no delay.
Really, really cool demo. He said his company had a breakthrough, enabled by the Trainium
3 chip, after trying out several other competitor chips, including
Nvidia's processors. Dozens of programmers and AI researchers from his San Francisco-based
company had been trying for four months to train a version of Decart's flagship AI-powered
video generation application, known as Lucy, that would be able to render footage in real time
without bugs or hiccups.
AWS gave Decart early access
to Trainium 3 after meeting
with the startup and being impressed
with the founders. The company was
two weeks into a marathon coding session
in a rented house in Silicon Valley, which I think he
took us on a tour of while he was in
wizard land, an AI-generated
sci-fi world. It was very fun.
When he noticed that a few of his
employees were celebrating wildly behind
him: "The moment that I saw
it worked, I saw four people
just start jumping up and down,"
said Dean. "The next question was how fast can we get it to market and start changing industries
with it?" The launch of Trainium 3 is the latest broadside against Nvidia, which dominates the
GPU market. A flurry of deals in recent months have caught the attention of investors
indicating that more AI firms are seeking to diversify their suppliers by buying chips.
Meta Platforms is in talks with Google to buy billions of dollars worth of advanced AI processors
known as TPUs, and OpenAI has struck deals with
NVIDIA rival AMD as well as Broadcom.
Very exciting that Decart got good results
out of the Trainium chip.
That's awesome, obviously.
I'm sure everyone over at Amazon has been working very hard on that.
At the same time, we've heard that Anthropic
maybe didn't have that great of an experience with Trainium,
and that's why maybe they're moving over to TPU a little bit more.
Even though Amazon remains a major shareholder in Anthropic.
And so my question is, will AWS buy TPU from Google?
I asked Matt Garman that question.
You asked me that question.
Yes.
I said they will be mocked.
They would be mocked.
Which is ridiculous.
Where I'm just like, please, my arch rival, can I please get some chips for my data center to compete with your data center?
Okay.
Well, let's actually go to what Matt Garman, the CEO of AWS said on TBPN yesterday because I asked him, will you be buying TPUs?
And he said, hey, look, we're very excited about Trainium,
and we think it has enormous potential,
and we absolutely think there's a benefit
to optimizing every layer of that stack.
People were joking on the timeline,
you know, oh, there's this new Trainium chip,
and somebody was like,
all five people using Trainium are ecstatic,
you know, that there's this new news.
Amazon's so bad at hype.
Trainium is used by 500 million people through Bedrock,
but their marketing team just can't.
AWS is undervalued, blah, blah, blah,
and he's obviously a bull on the stock.
But what's interesting is that, like,
it is, it is deployed. I'm sure some of their GTM staff today, let's just say, you'll have years to
accumulate stock at cheap prices. That's funny. Yes, there obviously is value even if Trainium winds up
being for a particular niche. Like maybe it's for real-time video. Maybe that's what it gets
really good at. The thing with real-time video that's interesting, something that Decart is focused on,
is working with live streamers, specifically on Twitch. Yeah. Amazon owns Twitch. Oh, that'd be cool.
That makes that kind of partnership more interesting.
I like that.
So obviously there is value to saying,
hey, if you go to AWS, you can get bedrock and some services
that have been fine-tuned specifically for Trainium.
You go all the way down.
You're going to get very good performance
because we have a stack from top to bottom that's very efficient.
But at the same time, if you're trying to do something
that's sort of like not within the Trainium ecosystem,
you might have a rough go.
You might wind up on a different chip.
But he did say something.
He said, we are going to support choice for our customers as well.
And so we'll continue to offer GPUs from Nvidia as an example.
And we'll have, and we have a very tight partnership there.
So this idea of customer choice, I think, is important.
And if you go back to Jeff Bezos, he said, we're not competitor obsessed.
This idea that Google is their arch rival, that's not in Amazon's DNA.
Jeff Bezos said, we're not competitor obsessed.
We're customer obsessed.
We're customer obsessed.
And so if the customer says, look,
it's great that you acquired Annapurna Labs for $350 million.
I'm really happy with what you've done with Trainium 3.
It doesn't work for me.
I'm the customer, and I want you to give me an Nvidia GPU
in your server or in your data center,
or I want you to give me a TPU in your server.
They might do that, because that's actually
in Amazon's DNA.
Yeah, and then the follow-up question
is, is there any world where Google sells TPU to Amazon?
Already, they are partnering.
Like this was another partnership that came out.
Separately, there was an announcement of an AWS partnership
with Google Cloud.
Now, they aren't buying TPUs, but what they're doing
is they're enabling customers to establish private,
high-speed links between the two companies'
computing platforms in minutes instead of weeks.
And so the general idea here is that Google has some amazing
AI capabilities that customers are just struggling
to match on AWS at this point.
And the same thing's happening on Microsoft as well.
because on Azure, you have access to OpenAI models that you might not have access to on AWS.
And so even though your whole infrastructure might be on AWS, you might be going back and forth to GCP constantly.
Companies used to think about AI as a special piece of their application, so it would be fine to bounce around to another cloud to get the best possible results.
But if the next generation of companies, I'm sure we'll talk to some of the AI-focused YC Demo Day companies today about this.
I hope there's at least one.
I hope there's at least one company that's doing something with AI.
That would be a real treat.
So it used to be fine to bounce around.
Now the next generation companies,
they're maybe making their entire infrastructure decision
based on who has the best AI products.
What are you laughing at?
I'm laughing because I texted Simon.
Turbopuffer has a booth at AWS.
I said, how's it going at Reinvent?
And he says, I'm not there.
I just make it seem like I'm there as a joke
because the VCs keep going to the booth.
And then our growth intern is like, oh, Simon, I don't know.
I think I saw him over there.
Just continuing to mug while ARR skyrockets.
Amazon needs to fight back against this, and allowing high-speed interconnect between AWS and GCP
solves a piece of that. But will they go further?
NVIDIA has an insane amount of power right now.
They've just ramped full-year revenue from $27 billion in 2023 to $60 billion in 2024 to
$130 billion in 2025.
That's like one of the greatest revenue ramps at scale in history.
And then also they grew their net profit margin from 16% to 56%.
That's insane.
Insane.
Yes, goat.
That's why Jensen Huang is on Joe Rogan, and I'm sure it's going to be a fantastic
episode.
All the hyperscalers and OpenAI, but that creates problems, right?
Because all the hyperscalers and OpenAI are now sort of incentivized to form a bit of an
anti-NVIDIA alliance to commoditize the accelerator market
and drive down those margins a bit.
So 56% net profit margins on 130 billion of revenue.
People are just sitting there and they're like,
there's 50 billion dollars of profit over there.
Like that's a lot of acquisitions.
And that's our-
That's our costs.
Yeah, that's our costs.
Like you're just eating a lot off of these plates.
How much do you think it hurts Amazon
that they don't have a dedicated podcast guy?
Like they don't have a Sholto, they don't have a Sam, they don't have a Satya.
You know how much that hurts because they definitely have someone in that role.
You just don't know them.
That's what I'm saying.
Yeah, they don't have someone who's the person.
They might have the title, but they're not really in the driver's seat, right?
Yeah, they don't have a Rune.
They don't have a Rune, right?
They don't have a Sholto.
Yeah, they should step it up.
They should definitely get someone.
Fortunately, I mean, the SemiAnalysis crew was over there,
taking pictures, sharing photos on the timeline of the Trainium 3 UltraServer, liquid-cooled,
with a lot of heart eyes.
That's a glowing endorsement
from the SemiAnalysis crew.
And look at this.
Very purple.
I wonder if that's like intentional.
Google is having this kind of success
with TPUs.
What about Amazon's Trainium?
Trainium is new and underpowered.
Just 667 teraflops BF16.
It has lots of HBM,
but the bandwidth is lower
than the H100 and TPU v6e.
This is competitive with H100,
not on HBM or bandwidth.
And Ironwood is competitive
with Blackwell on flops,
bandwidth, and HBM capacity.
I expect Ironwood to quickly gain market share as it ramps up, as you can see from throughput
per TCO, Nvidia versus Trainium. Rubin mogs Trainium 3 harder than Blackwell versus
Trainium 2 on TCO training flops, and reduces the gap by 5% on TCO memory bandwidth.
So the gap between Nvidia and Traneum is actually increasing rather than decreasing.
By the way, this math was done before CPX was introduced.
I won't be surprised if CPX plus Rubin is cheaper than Trainium for inference.
And so I do think that there's a world where there's something specialized, like what's going on with Decart, some sort of special model that thrives in what Trainium is good at, and they can further niche down.
But we'll see.
I mean, maybe they come from behind and they just destroy TPU and we're all talking about Trainium next year.
We're going to say a little rest in peace.
Rest in peace.
To Claude.
San Francisco's beloved albino alligator has passed away at age 30.
That's a good age.
I don't know how long alligators typically live,
but I'm glad we're looking it up.
It looks like 30 to 50 years for the American alligator,
often reaching 70 years or more.
So a little bit short. But Claude was, of course...
Yes.
You know, obviously people started speculating immediately.
Anthropic, of course, was the sponsor of Claude.
Yes.
And, you know, people were wondering, was there foul play involved?
Was it possible? This poor dinosaur, not dinosaur, alligator, passed the day that it got
announced that they've hired IPO lawyers. Some people were speculating: is it possible
Claude was sacrificed to the capital markets gods in some type of ritual? But anyways,
look at this expression he has on his face. Can we zoom in a little bit?
What a cool guy.
And he will be remembered.
Yeah.
The Trump administration will invest $150 million
into a lithography startup called X-Light,
its first CHIPS Act award. Chatted this morning
with X-Light's CEO.
There's a few lithography companies now.
We've had some on the show.
It's a very interesting tier of investment,
like $150 million from the government that feels like a series B.
They did raise a Series B this past summer,
led by Playground Global. Makes sense that the government's
investing; they invested in Intel. Pat Gelsinger, of course, former Intel CEO, is now getting involved in
X-Light, marshaled $40 million of capital, went and got $150 million from the government. There's also
another AI startup that wants to remake the $800 billion chip industry. This one's in the Wall
Street Journal: founded by ex-Google researchers, Recursive Intelligence raised $35 million with backing from
Sequoia to automate chip design. Obviously, this is not lithography. This is the design process, but still.
This is AI for AI chip design.
Oh, that's right, yes.
On a quiet residential street, a few blocks from Stanford University.
Two former Google researchers are launching a startup.
They hope we'll remake the $800 billion chip industry,
trying to build software that can automate the design of cutting edge chips,
a prospect that would allow every company to build their own chips from scratch,
working from the top floor of a suburban home.
The duo recently raised $35 million to kickstart recursive intelligence
with funding from Sequoia Capital.
Sorry, Recursive.
Just putting it in the name.
We got to add that to the list of, because there's standard capital, modern capital,
standard intelligence, modern intelligence.
Raw intelligence.
Wow, the company raised $35 million at a valuation of $750 million. That's very low dilution.
What, 5% or something like that?
VCs were mocked.
Yeah, I would have assumed this would be a very capital-intensive business, but I suppose
if it's just a software that they're developing, maybe they have more control.
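The dilution figure tossed around above can be sanity-checked in a couple of lines. This is a sketch using the numbers from the segment ($35M raised, $750M valuation), and it assumes the $750M is a post-money figure, which the conversation doesn't actually specify:

```python
# Back-of-envelope dilution for the raise discussed above:
# $35M raised at a $750M valuation (assumed post-money).
raise_amount = 35_000_000
post_money = 750_000_000

dilution = raise_amount / post_money
print(f"{dilution:.1%}")  # prints 4.7%
```

That's roughly the "5% or something like that" guessed on air; if $750M were pre-money instead, the dilution would be 35/785, or about 4.5%.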
Here. Thoughts on AI progress. He says he's moderately bearish in the short term, but explosively
bullish in the long term. Very interesting. So he says he's confused why some people have short
timelines. They say AGI is coming soon. But at the same time, they're bullish on RLVR, which is
reinforcement learning with verifiable rewards. He says, if we're actually close to a human-like learner,
this whole approach is doomed. Currently, the labs are trying to bake in a bunch
of skills into these models through mid-training. There's an entire supply chain of companies
building RL environments which teach the model how to use Excel to write financial models.
For example, I think we're actually talking to an AI Excel analyst for Excel power users
called Crunched, a YC company, at 12:50.
In the context of when AGI arrives, when superintelligence arrives, I understand it
to be an either-or question: either these models will soon learn on the job in a self-directed way,
making all of this pre-baking pointless, or they won't, which means AGI is not imminent.
Humans don't have to go through a special training phase where they need to rehearse
every single piece of software we might ever use.
When we see frontier models improving at various benchmarks, we should think
not just of increased scale and clever ML research ideas, but billions of dollars spent
paying PhDs, MDs, and other experts.
One counterargument I've heard from the takeoff-within-five-years crew is that we have to do
this kludgy RL in service of building a superhuman AI researcher, and then the million copies of
automated Ilya can go figure out how to solve robust and efficient learning from experience. This gives
the vibes of we're losing money on every sale, but we'll make it up in volume. This automated researcher
is somehow going to figure out the algorithm for AGI, something humans have been banging their heads
against for the better part of a century, while not having the basic learning capabilities that children have.
That seems super implausible to me.
You've been asking about economic diffusion, what is the rate that we're diffusing?
Let's see what Dwarkesh has to say about economic diffusion.
He says that economic diffusion lag is cope for missing capabilities.
I'm very sympathetic to this because when I go to the doctor's office and they hand me a piece of paper,
I know that a web form is good enough.
The capabilities of the digital form are complete.
No, it's just a diffusion problem.
There's just someone who runs that doctor's office
is like, I like doing it the old way, right?
And that's the economic diffusion lag problem
that I think is real in a lot of scenarios.
Right now, AI is great at generating text, right?
It's great at kind of analyzing a piece of content
and then generating text based on that.
And yet we still have multiple people
on the team at TBPN whose
job is to find interesting moments of the show and then create captions around that
and share it to X and Instagram and YouTube and other platforms.
And Dwarkesh has experienced that too, where he was trying to find the most interesting pieces
of a full podcast with one big Gemini prompt, and he was trying all the different models
and couldn't get it to actually find the most salient and viral points.
The other thing that stands out is that one of the seeming missing capabilities is the
ability to identify humor, or even something that's almost emotional.
So Ilya and Dwarkesh talked about this, where I think Ilya was giving the example of
scientists studied people who had had various brain injuries that limited their ability
to experience emotion.
And when they took out emotion, it can take somebody two hours
to figure out which pair of socks to choose.
And they were kind of like stunned.
Like it's just a pair of socks.
You know what's going on in your day.
Why do you need emotion in order to make that kind of decision?
And so it seems like at least in AI, a missing capability is like, okay, finding out like what's an interesting moment of a podcast, right?
Is it something that makes the audience member feel something, right?
The other thing that's notable is that on Whop, one of the top jobs that people do, or the way they make their first dollar online, is just clipping for various content creators
and media companies.
And some of the clips that they make are so sloppy.
Like, it's literally just a random segment of the show,
and they're blasting it out from like 20 different accounts.
And the fact that we're still paying humans to do that
just feels notable.
Well, let's read Dwarkesh's take on economic diffusion lag being cope for missing
capabilities.
Sometimes people will say that the reason that AIs aren't more widely deployed across
firms and already providing lots of value outside of coding is that technology takes a long time
to diffuse. Dwarkesh thinks this is cope. He says people are using this cope to gloss over the
fact that these models just lack the capabilities necessary for broad economic value.
New technologies take a long time to integrate into the economy? Well, ask yourself,
how do highly skilled, experienced and entrepreneurial immigrant humans manage to integrate into
the economy immediately? Once you've answered that question,
note that AGI will be able to do those things too.
If these models were actually like humans on a server,
they'd diffuse incredibly quickly.
In fact, they'd be so much easier to integrate
and onboard than a normal human employee.
They could read your entire Slack and drive in minutes
and immediately distill all the skills
that your other AI employees have.
Yeah, I agree with that.
The one thing that I don't necessarily agree with here,
he says, well, ask yourself,
this quote from Steven Burns,
How do highly skilled, experienced, and entrepreneurial immigrant humans manage to integrate into the economy immediately?
I mean, they do sort of integrate into the economy immediately, but like the immigration flow is like a slow process.
Like it doesn't just happen immediately.
It's not just like, you know, the amount of immigration went from like zero to like, I don't know, a million people or something.
Like it's like people move around.
There is like a, there is a bit of a drag.
But I understand what he's saying here.
It does make sense.
Silicon Valley is rallying behind a guy who sucks.
It's like, what does that mean?
Pure ad hominem.
It's rage bait.
It's going to go hard.
It already got a thousand likes.
On a linked article, the Verge is not putting up a thousand likes per link.
So this is outperformance.
And it's heavily paywalled.
You cannot learn how David Sacks sucks without subscribing to that thing.
They did a good job.
You got to pay.
You want to know why he sucks.
That'd be really fun.
if behind the paywall it's like, we're just kidding, he's actually awesome. We think the
New York Times missed on this one. The startup told me that one of their
investors didn't like that they were selling to newly founded startups and
wanted them to sell to bigger companies who have more money. If investors tell
you this, write them off as idiots. Selling to startups is the best thing you can do.
I'm sure many of the companies we're talking with today will be selling to
other companies in the batch. A lot of people say that's bad. Yeah.
They try to say, like, YC is a circular economy, but you have to ignore the hundreds of, you know, very real businesses that have, you know, been created through YC and gone on to work with every kind of company in the world.
Yeah. Even if there's a, you know, some sort of insular circular economy in the startup ecosystem, like there's a pretty immense amount of pressure to actually deliver something that's valuable.
Yeah, they're being rational.
It's not like, I'm sure there have been small instances where companies actually, you know, had somewhat bad behavior.
But in general, it's like if I'm going to pay for the SaaS tool or the beta that you're running, it has to be good.
So there's a $1.5 billion judgment against Anthropic for including 480,000 books in training their AIs.
Five of my books are among them.
As it is, there might be a $1,500 payout per book, according to my agent,
Max Brockman. I wrote to my agent Max the following: If any payment comes to me, please send it back
to Anthropic with my thanks for including my books and their AIs. The judgment website offers
a way to opt out of the payment, but I found it cumbersome. So I didn't. I'm principled, but too
lazy to be highly principled. The secondary markets are rife with fraud and bad actors,
and it pains me to see these bottom feeders profiting off of Anduril's growth while fleecing
retail investors through unreasonable or opaque fee structures, in this week's episode of nonsense.
Ignite VC, a fund we've never taken a meeting with or had any contact with whatsoever,
founded by Brian, who we've never met, is soliciting investors via public Google Doc to invest in an
SPV that will in turn invest in another SPV that will in turn potentially enter into a forward
contract with a supposed, though unnamed, early Anduril employee. A few problems here.
First off, so-called forward contracts are notoriously hard to settle in private companies,
and counterparty risk is extremely real. What about the many complicated
corner cases like acquisitions where shares don't trade or marriages, divorces, or deaths where
ownership of the underlying shares is complicated. Just generally a risky structure to close that I don't
think most folks actually understand. Yeah, if you enter into a forward contract and you basically
buy the right to the future value of some shares and then somebody gets, you know, again,
married or divorced, or passes away, or there's a bankruptcy, you might not actually be
able to collect, even if your investment should have generated some return. Matt says,
second, this deal memo includes basically no details about Anduril's performance, no revenue figures
whatsoever, no product specifics. I guess that's good, right? Like, if they were just floating
around information that they had acquired, almost as if it's soliciting investors to invest
on hype and momentum and not fundamentals. Generally, I'd advise folks to be skeptical of any
deal memo lacking basic details. Third, forward contracts are explicitly disallowed
by Anduril's stock plan and bylaws, which means that Anduril will never consent to Team
Ignite's SPV actually taking possession of these shares while we are privately held. Zero chance.
And finally, the memo spends most of its time talking about the structure and fees, which are
insane.
A double-layered SPV with all legal and admin costs pass through in addition to an 8% upfront
fee, 3% annual fee for two years, 20% carried interest, and the craziest part, an implied price
per share that is completely insane.
In this case, the implied PPS is 115% higher than the most recent preferred raise from nine months ago.
Flattered, I suppose, but also puts these investors in an almost absurd position by paying more than double the price per share of our most recent transaction.
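To make that fee stack concrete, here's a rough sketch of what the quoted terms cost an investor. The 8% upfront fee, 3% annual fee for two years, 20% carry, and the 115% premium over the last preferred round all come from the memo as quoted; how the fees are actually charged (on committed capital, with carry taken on profit above the commitment) is a simplifying assumption for illustration, not a claim about the actual SPV documents:

```python
# Illustrative math for the fee stack described in the segment.
# Quoted terms: 8% upfront fee, 3% annual fee for two years,
# 20% carried interest, implied PPS 115% above the last preferred round.

def fees_per_dollar_committed(upfront=0.08, annual=0.03, years=2):
    """Fees paid before any carry, per $1 committed (assumed charged on commitment)."""
    return upfront + annual * years

def net_multiple(gross_multiple, upfront=0.08, annual=0.03, years=2, carry=0.20):
    """Rough net return multiple after the quoted fee stack.

    Simplification: fees come out of committed capital up front, and
    carry is taken on profit above the original $1 commitment.
    """
    deployed = 1.0 - fees_per_dollar_committed(upfront, annual, years)
    gross = deployed * gross_multiple      # value of the position before carry
    profit = max(gross - 1.0, 0.0)
    return gross - carry * profit

# 14 cents of every committed dollar goes to fees before any shares are bought.
print(fees_per_dollar_committed())         # prints 0.14

# Even if the underlying shares double, the investor nets ~1.58x.
print(round(net_multiple(2.0), 3))         # prints 1.576

# And the entry price is 2.15x the last preferred round, so the shares
# must more than double just to get back to the prior mark.
print(1.0 + 1.15)                          # prints 2.15
```

In other words, between the 2.15x entry premium and the fee drag, the underlying shares would need to appreciate several-fold before the SPV investor sees a meaningful return, which is the substance of Matt's complaint.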
As stated, at the top, I don't know Brian or Team Ignite at all.
Maybe they're kind of wholesome people, and this is all a big misunderstanding.
But if I were an investor looking at this, quote, I'd run for the hills.
And I believe the founder replied and said, appreciate the heads up.
The document referenced was an internal draft prepared for discussion with an existing LP and was not intended for public circulation.
It appears someone shared it without authorization, and we're looking into how that happened.
But do you see what...
There's like seven people that shared a screenshot of a direct email we got with this exact memo.
Okay, and the other thing is they say they're not soliciting investment for any Anduril-related vehicle.
Matt says, really, the draft was written by your founder and managing partner.
I literally watched him edit the doc in real time.
And he has a screenshot of, like, the founder's name in Google Doc.
What a mess.
Don't do this.
Don't do it.
Instead, why don't you start a company and apply to Y Combinator, build an actual business
instead of going around hustling SPVs and companies that don't want to sell shares.
We will talk to you later.
Cheers.
Goodbye.
