Limitless Podcast - Cerebras IPO: The Tech Breakthrough That Could Change Everything
Episode Date: May 12, 2026

Cerebras is an AI chip company set for a groundbreaking IPO, with a revolutionary chip that could accelerate the development of AGI from 15 years to just 5. We explore the implications of their unique chip architecture within the context of their partnership with OpenAI. As Cerebras positions itself to raise $4.8 billion, we analyze how its innovations could disrupt NVIDIA's monopoly and shape the future of the AI market.

🌌 LIMITLESS HQ ⬇️
NEWSLETTER: https://limitlessft.substack.com/
FOLLOW ON X: https://x.com/LimitlessFT
SPOTIFY: https://open.spotify.com/show/5oV29YUL8AzzwXkxEXlRMQ
APPLE: https://podcasts.apple.com/us/podcast/limitless-podcast/id1813210890
RSS FEED: https://limitlessft.substack.com/

TIMESTAMPS
0:00 Intro
1:30 Cerebras Chip Design
3:22 OpenAI's Investment
5:29 IPO
7:37 Inference vs. Training
11:59 Memory
13:34 Distribution
16:17 The Bear Case
18:04 The IPO Landscape
20:50 Investment Strategies
22:04 Closing Thoughts

RESOURCES
Josh: https://x.com/JoshKale
Ejaaz: https://x.com/cryptopunk7213

Not financial or tax advice. See our investment disclosures here: https://www.bankless.com/disclosures
Transcript
There's a chip being built right now in California that the founders of OpenAI believe is so important that getting access to it could decide who builds AGI first.
In fact, the co-founder of OpenAI just testified last week that he and Ilya Sutskever calculated it would take 15 years to build AGI.
Until they discovered this hardware that could get them there in just five.
The company is called Cerebras, and they're actually going public tomorrow.
For the longest time, it has been a unanimous consensus on who the king of AI is.
It's been Nvidia, and they've owned the entire moat on AI chips specifically. And this moat has made them famous and pretty valuable.
They are worth $5.3 trillion, the most valuable company in the world. But with success, there's a double-edged sword. And it highlighted a bunch of constraints that Nvidia GPUs couldn't really fix.
One of these constraints is something known as inference. The GPUs that Nvidia creates aren't as hyper-optimized toward that. And that is the issue that Cerebras focuses on specifically. They create these chips, which generate lightning-fast inference.
What this means is if your AI models run on these Cerebras chips, they are lightning quick.
And in the current AI industry, we're heading towards a world where inference is valued and matters
way more than AI training for models.
In fact, it's estimated to be 10 to 50x as large by J.P. Morgan in their latest report.
So the point is, if Cerebras's chips can fulfill that inference constraint for AI models specifically, this could be the first real threat that Nvidia faces on their own. Okay, so let's talk briefly, before we get into the IPO, about what makes Cerebras so special, so different. Because clearly there's some novel architecture here.
And perhaps we can describe it as the pizza problem, where every chip you've ever owned kind of starts the same way. It takes this silicon wafer, about the size of a dinner plate here that I have for scale. So it's this big giant thing, and it gets cut into a series of tiny little chips that go into your iPhone or your iPad or your MacBook or whatever it is.
But Cerebras decided to question that. They said, what if you don't cut the pizza? What if the whole pizza is the chip? And that's exactly what they did. They just took that giant wafer, and instead of chopping it up into smaller chips, they just created it all as one big thing.
And they had this breakthrough that allowed them to fit 1.2 trillion transistors on a single wafer, as opposed to someone like Nvidia, who currently has 21.5 billion. And we can see the discrepancy on the screen here. It is a huge amount of additional compute that you can put on a single chip. Now, what does that result in? Because the information doesn't have to travel as far, the chip is able to access it all so much faster.
So they put all of the memory on the chip, all of the processes on the chip, and the result
is just significantly faster AI that works for the entire stack between inference all the
way up to pre-training.
And this is a really huge win for a lot of the industry.
It's a big breakthrough.
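To put those two transistor counts side by side, here's a quick back-of-the-envelope check. The figures are the ones quoted in the episode, not vendor spec sheets:

```python
# Transistor counts as quoted in the episode (hosts' figures, not spec sheets).
cerebras_transistors = 1.2e12  # 1.2 trillion on the full wafer
nvidia_transistors = 21.5e9    # 21.5 billion on a single cut die

ratio = cerebras_transistors / nvidia_transistors
print(f"~{ratio:.0f}x more transistors per chip")  # ~56x
```

That ratio is roughly in line with the "50 times more data on a chip" comparison the hosts make.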
And it's not just a theoretical concept either, is it?
So OpenAI, about probably like six months ago, invested $10 billion into their company, and they acquired the rights to basically take over all their chip designs and use them for OpenAI models specifically.
And we've seen that manifest in their existing models.
So OpenAI has this world-leading coding AI model, right?
It's called Codex.
But what most people don't know is there's a fast mode for Codex.
It's called GPT Codex Spark.
And it is absolutely rapid.
There's like near zero latency.
It's running on Cerebras chips.
So the concept isn't a concept anymore.
It's proven at scale to work with these big models.
And if you are someone that is working on the frontier of software engineering, then with every iteration cycle, the speed at which you design and ship software matters a lot.
If you're in financial services and you need to get that product out
or make that trade, you need the fastest chips.
So these chips are incredibly fast.
I mean, we're talking 50 times more data on a chip than some of the Nvidia chips.
But the result is that it's very expensive.
These chips are not cheap to make.
The good thing is that money doesn't seem to be a problem
when it comes to a lot of these AI companies.
And OpenAI has committed, what, $10 to $20 billion already? And this is taking us to the headline news of the day, which is the
IPO. The company that everyone is super excited about in terms of accelerating the rate that we get
to AGI is actually going public. And the numbers are a pretty big deal. This is the first AI
hardware company to really go public as we're going on this parabolic trajectory. And I think
the numbers are going to get pretty staggering in a very quick way. Yeah. So to give some context on
this, about a week and a half ago, Cerebras announces, or basically files for, their IPO,
And there's a prospectus that comes out, right?
Which explains basically, you know, why they're raising and how much specifically they're going to raise.
A week and a half ago, that number was $3.5 billion that they were going to raise.
And the price per share was roughly around $115.15.
Something happened immediately after that, though, which is they became 20x over subscribed.
So there were 20 times more people that wanted to invest in this thing, qualified institutional, et cetera, for their IPO,
such that they had to revise the terms, Josh.
So they ended up issuing 2 million more shares at a higher price per share.
It's 150 bucks.
So the total amount that they're raising is actually $4.8 billion, call it $5 billion, $1.3 billion above what they were expecting to do,
which tells me that there is an insatiable amount of demand for this particular IPO.
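Those revised terms roughly check out arithmetically. Here's a quick sketch using the numbers quoted in the episode; the initial share count is inferred from the target and price, not stated directly:

```python
# IPO terms as quoted: $3.5B target at ~$115.15/share, revised to $150/share
# with 2 million additional shares issued.
initial_raise = 3.5e9      # dollars, original target
initial_price = 115.15     # original price per share
revised_price = 150.0      # revised price per share
extra_shares = 2_000_000   # additional shares issued

initial_shares = initial_raise / initial_price           # ~30.4M shares (inferred)
revised_raise = (initial_shares + extra_shares) * revised_price

print(f"Implied revised raise: ${revised_raise / 1e9:.1f}B")  # ~$4.9B, close to the $4.8B quoted
```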
Everyone's talking about SpaceX.
They're talking about Anthropic and OpenAI's potential IPOs later this year.
But this is one of the real IPOs that shows that there's a huge demand in the West for AI companies, and it's exciting to see because this is the first of many. Yeah, well, we have
Cerebras kicking things off. Then we have the rumors for SpaceX, OpenAI, Anthropic, Databricks,
perhaps even Stripe. That's a combined potential $3.6 trillion in market cap that could be
entering the market. And we have this hot ball of money. We just filmed an amazing episode last week
about the AI stack and where every company kind of lines up in where the opportunities are for this
new era of AI investing. And this sits very squarely in the hardware layer right now. Right next to someone kind of like Nvidia, right?
Like, they're not quite Nvidia's size.
They're nowhere near Nvidia's size,
but they're competing on the same kind of level,
the same type of architecture that a company like Nvidia is
just at a very different angle using these huge pizza-sized wafers.
Yeah, so a point I want to make on the Nvidia thing specifically is,
okay, so there's one framing of this where if the Cerebrus IPO is successful,
it's the first real signal that Nvidia's monopoly on AI chips is actually going to be threatened,
right?
And this might sound like a crazy thing to say, right?
It's a small company that's raising $5 billion versus Nvidia's $5.3 trillion.
But there's another company which proved that Nvidia's moat might also be shakable, right?
And that is Google last week who became the most valuable company.
I don't think they're currently the most valuable company, but they beat Nvidia.
And the reason why they did that was because they released these brand-new TPUs, their version of their own GPUs, which they train their Gemini models on, and which so many customers, including Anthropic and OpenAI, want to use to train and inference their own models.
So the point is, the ground is shifting beneath Nvidia. Even though they have like decades and decades' worth of experience building these GPUs, there might still be viable competitive threats against their particular moat.
The other thing I want to say is, and I mentioned this earlier, inference versus training.
For the longest time, people have said AI training is where all the money is going to be, and that's where Nvidia has lived and breathed over the last decade.
But what has shifted recently is what matters more is what happens after the training.
How many people are going to be using agents that query the models? Inference is where the money is going to be made, where the value of the opportunity is going to be huge.
And Cerebras plays directly in that space.
Nvidia actually proved this, because they acquired another company called Groq for $20 billion, which does something similar to what Cerebras did.
So it's validating the fact that Cerebras is going to be valuable, and now the public can get exposure to it.
It's cool.
And I have some older benchmarks about how fast Cerebras runs versus the rest of the world.
Taking a model like Llama 4 Maverick, Cerebras was delivering 2,500 tokens per second, which is more than two times NVIDIA's flagship Blackwell chip.
And then if you go down to Llama 3.1 70B, a 70-billion-parameter model, it was 20 times faster than Nvidia.
So the actual rate at which it's able to deliver these tokens is huge.
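To make that throughput difference concrete, here's a toy calculation of what token rate means for a long reasoning chain. The Cerebras rate is the one quoted above; the Blackwell rate is approximated from the "more than two times" comparison, and the token budget is a made-up illustration:

```python
# Throughput as quoted for Llama 4 Maverick; Blackwell rate approximated from
# the "more than two times" comparison. Token budget is hypothetical.
cerebras_tps = 2500               # tokens/sec on Cerebras, per the hosts
blackwell_tps = cerebras_tps / 2  # ~1250 tok/s, rough approximation

reasoning_tokens = 100_000        # hypothetical long chain-of-thought budget

for name, tps in [("Cerebras", cerebras_tps), ("Blackwell (approx.)", blackwell_tps)]:
    print(f"{name}: {reasoning_tokens / tps:.0f} s to emit {reasoning_tokens:,} tokens")
# Cerebras: 40 s; Blackwell (approx.): 80 s
```

Halving time-to-answer compounds quickly once agents chain many reasoning steps together.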
And one of the most important things that we're seeing right now is kind of reasoning,
chain of thought, deep thinking of these agents. And when you're able to deliver tokens at a much
faster rate, you can get from the question to the answer much quicker. One of the new things
that we've seen in Codex recently that has become super popular is the backslash goal command, where you can give it a goal, and it'll go off and think for 24, 36, 72 hours to collect enough information to get to a single answer. This can make that process significantly
quicker and therefore make agents much more capable. So the reasoning speed in terms of inference,
in terms of delivering reasoning, is so important. And although currently, I believe, fast mode on ChatGPT is about 1.5 times the cost, you have to suspect that that cost is going to go down at some point. And that makes Cerebras a really compelling company. And I think that's what OpenAI saw when they made this huge investment in them. Now, Ejaaz, I have to ask you, what are you thinking
about this IPO? Is this something that you're interested in investing in? It's a tricky one, because
my default answer is yes, but if I want to, like, touch grass for a second and get off my high horse, the IPO is kind of already happening. It's just happening in the private market, if that makes sense. Like, think about this, right? Nvidia, when they IPOed, I believe they appeared at around $800 million. That's million. It's under a billion dollars, right? And currently
we have like Cerebras, which, you know, they have one major customer in OpenAI. They've proved the concept fairly well. But they're raising $4.8, call it $5 billion, and this is all pre-IPO. So there's a lot of people that are making a lot of money.
We see this with the Anthropic secondary shares, OpenAI, the same thing with SpaceX as well,
before the actual IPO happened. So the skeptics are saying this is just going to be a retail
liquidity event where they're going to dump on retail and, you know, no one's going to really
kind of make any money after that. I don't think there's a zero chance of that happening,
but there is a chance. Now, the bull case for this IPO is one specific reason, which is there is
going to be an insatiable demand for two things, Josh. One is, I need to use this AI to do things
for me, whether it's agents or whether it's me prompting. Actually, I'm more bullish on the agent's
side, but we can tap into that as a second thing. The second thing is, Cerebras' chips have something very unique that Nvidia's chips can't get enough of. And that's something called memory, specifically static random access memory. It's a very unique type of memory which Cerebras installs into their chips, and it's actually what makes them incredibly unique.
Now, if you look at all the typical 99.9% of GPUs that people use, they use this thing
called high bandwidth memory.
It's composed of something known as Dynamic Random Access Memory.
It's kind of like the sister to static random access memory.
It's cheaper.
It's more scalable.
It's the sexier thing right now.
It's what all the major memory manufacturers have focused on for the last decade.
And that's what's being installed by default into these GPUs.
But there's one issue.
They don't deliver the information that the memory stores very effectively or quickly enough to allow these models to run very quickly.
Now, one solution to this is static random access memory, which Cerebras has focused on for the entirety of their existence.
And they have perfected the design and installation method for their particular chips.
There's a reason why it looks so much larger.
You know, you used that dinner plate just now, Josh, versus the Nvidia chips.
It is more expensive.
You mentioned that, like, the cost to use the things is more expensive.
But think about how much more value you could get by using quicker turns of your AI model.
You could end up having a super fast quant algorithm to trade your fund or to deliver access to a product for millions of your different customers.
And that in itself might be worth billions and billions more than having a slower memory component.
So that is the clear distinction between them, and what makes me the most bullish on the Cerebras IPO specifically.
Yeah, as I was reading about it earlier this morning, the SRAM versus DRAM conversation, one of the easiest ways I was able to kind of differentiate between the two is that SRAM is much more similar to an SSD, a solid-state drive, whereas DRAM is kind of similar to an HDD, a hard disk drive, something that has an actual needle, where it has a lot more capacity, but it's a bit slower.
Meanwhile, SRAM, so long as there is power provided to it, will retain that memory and information indefinitely.
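Where the difference shows up in practice is access latency: on-wafer SRAM is roughly nanosecond-scale, while a round trip to off-chip DRAM is on the order of 100x slower. Here's a toy comparison; both latency figures are order-of-magnitude illustrations, not vendor specs:

```python
# Illustrative latencies only: on-wafer SRAM vs. an off-chip DRAM/HBM round trip.
sram_latency_ns = 1.0    # assumed ~1 ns for on-chip SRAM access
dram_latency_ns = 100.0  # assumed ~100 ns for an off-chip DRAM access

accesses = 1_000_000     # hypothetical memory accesses during one inference pass

print(f"SRAM: {accesses * sram_latency_ns / 1e6:.0f} ms total")  # 1 ms
print(f"DRAM: {accesses * dram_latency_ns / 1e6:.0f} ms total")  # 100 ms
```

Multiply that gap across the billions of weight reads a model does per token, and it's the difference the hosts are describing.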
So in DRAM, one of the big problems is that the cache memory has to keep refreshing itself over and over, because it has a very short shelf life. With SRAM, that shelf life is infinite as long as there's power, and it can be recalled very quickly without that constant refresh. And that's the huge breakthrough. So it's a really compelling product that I think is very impressive and very necessary in this world. The only
real other GPU architecture we've seen that has had success is Groq. And we saw what happened with them. They had an amazing exit, and they are working now directly with Nvidia. So this
is the direct comparison. And for that reason, I think I'm pretty bullish on the company because
essentially Cerebras is tied to OpenAI's success, right? We know that both founders are actually invested personally. Greg Brockman and Sam Altman are both personally invested in the company. OpenAI is invested in the company. And OpenAI has every incentive in the world to scale with them and to grow with Cerebras. That, aligned with the fact that this is the first AI IPO we've really had in this chain that's going to kick things off, I can
expect there to be a lot of excitement, a lot of enthusiasm. So long as the market continues to go
up, so long as this token demand continues, I see no reason why Cerebras wouldn't follow in that
path and actually be somewhat successful. I think I'd love for everyone who's watching to share
their takes. Are you investing in the IPO? Why? Why not? It's going to be a big conversation over
the next coming weeks as we go through this thing. Another thing is like, it's one thing having a good
product, right? It's another thing getting the right distribution. Now, them signing this like $20 billion partnership with OpenAI is good, right? You get access to, what do they have now? Almost like a billion weekly active users or something crazy like that. The other thing is
their chips are integrated into Amazon's bedrock platform. I don't know if you knew this, Josh,
but Amazon basically owns the entire enterprise compute market because they give access to every
enterprise that wants to spin up an AWS instance or get access to specific tools through
AWS, their Amazon Web Service.
Now, they run Cerebras chips themselves there.
So Cerebras now has a distribution moat through that, right?
So the point is, distribution is solved. The ingenuity of going zero to one with SRAM is also solved, and they have the consumer moat being funneled through OpenAI.
So all of this sounds incredibly bullish,
but I do want to touch on the bear case for a very brief moment, right?
And I mentioned the first one earlier on,
which is the valuation is pretty steep, right?
the fact that they were 20x oversubscribed, and they just revised the terms in the last five days to give even more access.
So typically, just for those of you who don't know, for the average IPO over the last, I think, five years, the price on IPO pops between 30 to 80%.
So if you're a retail investor, right, that wants to get involved here and, like, buy this thing, you might be buying it at a significant premium, and these things tend to dump a little bit after the first couple of days, and then, you know, whatever, it figures itself out.
So the valuation is pretty steep.
The other thing I want to say is that
OpenAI is also building their own silicon.
They have a deal with Broadcom, MediaTek,
and a bunch of these other intermediaries
to design their own chip.
And Sam has specifically said
that one of the major cases that he's trying to solve
is inference.
So I'm kind of looking at this and I'm like,
okay, so you've invested in this company,
you're running their chips,
it's doing the thing,
but you're also building your own custom silicon.
Now, granted, that's probably going to take
years to try and figure out. But is the idea that you eventually acquire Cerebras? But they're IPOing, so I guess you can't really do that and merge them together? Or are you going to eventually ditch them, and they're like an intermediary solution? And so Cerebras might end up losing this customer down the road. I don't know, but those are the two things that stand out for me. That's fair. Yeah, there's also, they're trying to compete against the CUDA
moat, which is, I mean, very powerful and very, very strong, as we know. They also are, I mean,
they're trading at a valuation of 51 times revenue. A lot of these larger AI companies are actually trading at fairly low multiples right now. A lot of the major players in the Mag 7 are at no more than like 20 times revenue. This is at 51. And there's also one interesting
component where there is actually a memory ceiling currently on the chips where they're able to fit
44 gigabytes on a chip. And currently, that's unbelievable. But it requires a lot more
innovation if they're going to continue to scale to provide inference for multimodality and for
video models or for ultra-long context. There's going to need to be continued innovation, and I think the ceiling is not quite as clear. Like, we're not quite clear where this can go. We know it's incredible now. I'm not sure people really know where that's going to go. That's perhaps a longer-term problem. That's not something that would really impact the company short term, but it is something to be mindful of. I mean, this is,
it's an amazing technology. It's like, in your computer, you have your RAM and your hard drive as separate components, and this merges them into one. And now everything is so much faster because of it, and the unlock that has for generating tokens, which is the single most important thing that we do, is huge. So I think as a company, Cerebras is something to be very excited about. This is going to
be a fascinating case study as they go public this week to see what
that type of impact has in the market. And overall, I'm feeling really excited for some bullish
innovation. I mean, Cerebras is a huge unlock in the world of AI. Sam and Greg saw it first. The
rest of the world is going to see it this week as they go public. And we'll see what the downstream
effects of it are. I mean, it's the first domino of many, right? I think another obvious reason why it's getting a lot of attention is, like, it's AI summer, right?
We've got a bunch of cool IPOs that are coming up.
You know, I've got a graphic here, which is actually severely outdated.
It's funny.
It values Anthropic's IPO as being around $230 billion.
And I'm like, have you looked at the secondary market for their shares recently?
They're selling at like a trillion dollar valuation.
People are literally leasing their homes out to be able to, like, fund a purchase of Anthropic's secondary shares.
It's pretty crazy out there.
But the point being is, you've got SpaceX, you've got OpenAI, you've got ByteDance from China, you've got Anthropic, that are all planning to IPO this year. There is a lot of money that's going to be entering the market.
Some of it will be net new coming in from retail, coming in from funds that don't typically invest
in these things, but a lot of it is going to come from alternative types of companies.
Now, we've seen signals of where this is going to come from.
Whenever, for example, whenever Anthropic announces a new product or feature that replaces a certain
segment or industrial sector, you see the top market cap stocks of those sectors dump as they've
tweeted the announcement out, right? So it might be like SaaS companies are cooked or whatever
that might be, but it'll be interesting to see the market dynamics because I think there'll be
a lot of money sloshing around into these big ideas. And then the major question that
skeptics have is they don't think there's enough money. And this might be, you know, the popping
of the bubble metaphorically or financially. We might see a bunch of these like crash after the liquidity
event. That's more of the bad case, but I'm choosing to remain bullish on this because I think the
demand is pretty insatiable, right? Today, OpenAI announced like a huge enterprise partnership for deployed engineers. Anthropic did the same thing five days ago. So they're generating huge revenues. Like, Anthropic hit $45 billion ARR. They're aiming to hit $100 billion by the end of the year.
These aren't made up numbers. So, you know, I'm excited about it. Unbelievable businesses. And yeah,
I think, to your point, there may be some liquidity issues as we get to IPO number four, five, and six. This is the smallest, or among the smallest, of all of them
that's happening. So I'm not exactly concerned about that yet. But it's going to be a
crazy year. And this will certainly set some sort of a precedent for how the market considers
IPOs. Are you buying some of this tomorrow? I have to ask you. You know what? I think it'll
depend on how it trades. I probably won't buy it at the open. Maybe wait a little bit.
You're going to wait for that 30 to 80 percent premium? It involves.
I think the answer is yes. Like, I feel like I should, I should participate, at least in the
small amount in an IPO like this. Because even if it is overpriced on the short term, it'll reprice
itself over the next few weeks. And I feel like six months from now, this company will produce
better products than it does today. And the same will also be true 12 months from now. So I'm
not much of a trader. I'm more of an investor. This is a compelling investment. And I could also see a
world in which it does get acquired or a majority of it does wind up getting bought out by another
company. It's just incredibly important infrastructure. It seems valuable. We saw what happened with
GROC. We might have an opportunity to actually participate it again with Cerebris.
I'm feeling bullish. I'm feeling like I want to get in the mix. I think I want to get involved
a little bit. Same. Same. Hey, I'm down to buy an Nvidia at under a $10 billion valuation. If it even achieves 10% of Nvidia's market cap, you know, that would be a win. So yeah, I'm excited to see how this plays out. Yeah. Assuming that thesis holds up where we
continue to need tokens and those tokens are able to generate a profit, the people who can
distribute and produce those tokens faster than everyone else will win. And I think that is, that is a
valid enough thesis to have me convinced. But to the listener, to everyone out there who has just
watched 20 minutes of this episode, do you agree? Are you participating in Cerebras? Is this
overpriced? Is it underpriced? Is it stupid? Should we even be investing or considering? Let us
know in the comments down below. If you enjoyed this video, please don't forget to share it with your friends. Ejaaz, any final parting thoughts? I'm actually curious whether the listeners of the
show enjoyed this episode. We do various different types of investment episodes. The one that we did last week on the AI infra stack got so much attention, and we're going to do a follow-up episode on that. We heard you, we read all your comments. We're going to do it. But on something like this,
is this something that's interesting? Do you want to hear about upcoming IPOs? I'm curious to
hear from you guys, but aside from that, I think we're all good.
