Limitless Podcast - Memory Could Be the Last Great Investment of this AI Cycle
Episode Date: May 13, 2026

There has been a surge in memory stocks and memory ETFs, with AI inference demand reshaping the market and perhaps reshaping the entire memory industry. Micron, SK Hynix, and Samsung are interesting plays with their supply constraints, rising memory needs, and signs that shortages are starting to affect consumer devices.

------
🌌 LIMITLESS HQ ⬇️
NEWSLETTER: https://limitlessft.substack.com/
FOLLOW ON X: https://x.com/LimitlessFT
SPOTIFY: https://open.spotify.com/show/5oV29YUL8AzzwXkxEXlRMQ
APPLE: https://podcasts.apple.com/us/podcast/limitless-podcast/id1813210890
RSS FEED: https://limitlessft.substack.com/
------
TIMESTAMPS
0:00 Memory Companies
5:03 Inference Demand
6:52 Supply Constraints
10:58 Key Players
14:55 Market Conditions
20:04 Forecasting
23:31 Closing
------
RESOURCES
Josh: https://x.com/JoshKale
Ejaaz: https://x.com/cryptopunk7213
------
Not financial or tax advice. See our investment disclosures here:
https://www.bankless.com/disclosures
Transcript
If you invested $25,000 in SanDisk,
you'd have a million dollars today.
Memory stocks have absolutely skyrocketed,
and the reason behind that is simple.
It's AI.
Memory has become the core component behind GPUs, CPUs.
It actually makes up 50% of the materials cost to actually build AI models today.
And just six weeks ago, a new memory ETF launched,
giving you exposure to the top three memory manufacturers:
SK Hynix, Samsung, and Micron.
And just under a month from launching,
it is up over 100%.
It took in over $6 billion in assets
in the first five weeks of it operating.
In fact, Goldman Sachs referred to it
as the fastest growing ETF
that they've seen with $1.1 billion worth of inflows
in a single day.
So this begs the question:
are memory stocks, or AI memory specifically,
a good investment to make right now,
or has the bubble grown too much?
If we look historically at memory,
it's gone through very volatile boom and bust cycles.
So in this episode, we're going to explore whether this makes sense at all.
And I think I would personally argue that I think this time is different.
Memory has undergone a significant shift of demand where AI is just consuming all available supply.
So maybe to take a step back before we get into why.
We could talk about what memory is as it's referred to around AI, which is basically high-speed chips that sit physically next to the GPU.
And they hold the model weights and the data that the chips need to access during inference and training runs.
So this matters because these large models are bottlenecked not by how fast a GPU can do the math, but by how fast it
can pull data from that memory. So the speed and capacity of that memory directly determine
how big and how fast a model can run. So if you want to generate AI tokens,
memory is the single critical kingpin. And this is different from anything we've
ever seen before. Because for the last 40 years, memory was pretty much the textbook example
of a horrible business. There are three suppliers that make these chips: Samsung, SK Hynix,
and Micron. And they make identical products, which are essentially a commodity. And it turned into
this thing called a pig cycle, which is basically high prices, then everyone builds the same
fabs, then there's oversupply, which results in price collapse. And this cycle has kind of run
itself every couple of years since the 1980s. But Ejaaz, like you said, this time is different.
There are these two key traits that are converging that make it different this time.
So the first one is that the demand sink that comes from AI for memory has vastly outpaced any
other demand sink that memory has had before. So if you look at pre-AI memory stock charts and post-AI,
it looks completely like a penny stock. It's just gone parabolic, right? And so the question
everyone's asking is, is this real? Well, the answer is, it's true. If you look at
Nvidia specifically, when the AI cycle really kicked off, their GPUs were in incredible
amounts of demand. Now, back then, the models used a significant chunk of memory. Typically around
30 to 40% of the bill of materials cost for GPUs back then for Nvidia's early models,
went to memory, basically. Now, if you fast forward to today, it's 50%. And if you look at every
future iteration, including Rubin, Rubin Ultra, and Feynman, which is going to be launched later in
2028, it doesn't just require the same amount of memory. It requires 2, 3, and 4x the amount of
memory for every single iteration. So the point being is there is an insatiable amount of demand for
memory, and there are different types of memory. In fact, AI has created a completely novel type
of memory known as HBM, high bandwidth memory, which sits a tier above
conventional DRAM, which is your standard DRAM, which a lot of other consumer devices use,
and is built for AI specifically. So the point being, there are very limited amounts of memory available,
and AI is eating up pretty much the majority of it right now.
Yeah, so we have the network effects that are occurring just from AI demand,
but the high bandwidth memory is a key structural difference
that I think a lot of people aren't really aware of.
If you imagine a single traditional stick of RAM,
something that you might see plugged into a computer,
you could think of it as a one-story warehouse.
What these AI chips require right next to the GPU is this thing called high bandwidth memory.
And that is a warehouse but stacked vertically.
Mostly right now it's about 8 to 12 dies high,
soon it's going to be 16 dies high, and that requires a lot of very precise materials,
a lot of engineering, a lot of resources that make these chips far more complex, but also constrain
the supply far more. I mean, each HBM package is now effectively eight to twelve
of those traditional dies stacked on top of each other. So you have this convergence of a few things. One is just the need for them in order to
generate tokens, but specifically the need for that high bandwidth memory, that makes a huge difference.
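To put rough numbers on the warehouse analogy, here's a minimal sketch; the per-die capacity and stack heights below are illustrative round numbers I've assumed, not vendor specifications.

```python
# Rough capacity math behind the warehouse analogy. The 2 GB-per-die
# figure and the stack heights are illustrative assumptions, not any
# vendor's actual specifications.

def package_capacity_gb(stacked_dies: int, gb_per_die: int) -> int:
    # An HBM package multiplies one die's capacity by how high it stacks.
    return stacked_dies * gb_per_die

flat_die = package_capacity_gb(stacked_dies=1, gb_per_die=2)    # "one-story warehouse"
hbm_8    = package_capacity_gb(stacked_dies=8, gb_per_die=2)    # today's low end
hbm_12   = package_capacity_gb(stacked_dies=12, gb_per_die=2)   # today's high end
hbm_16   = package_capacity_gb(stacked_dies=16, gb_per_die=2)   # coming soon

print(flat_die, hbm_8, hbm_12, hbm_16)  # 2 16 24 32
```

The point the hosts are making is that every extra story in the warehouse is more capacity per package, but also more precision manufacturing per package, which is why supply tightens even as capacity grows.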
So now, not only do we have the supply constraints coming in terms of the actual structural
advantages that
HBM has, but you also only
have three players that actually produce this. There are three
manufacturers, and the constraint
is so tight that some of them
have actually stopped making commercially available
RAM because they're so focused on building
purely memory for AI.
Yeah, and there's also
a recent shift that has
happened within AI in general, where
previously it was very focused
on AI training, and so
it was very GPU heavy. That's why
Nvidia is now like the most valuable company in the world.
Now, that recently shifted towards inference.
Inference is where you prompt or call on a model and you get an answer back.
Instead of humans doing the majority of the inference,
we're finding that AI models themselves or AI agents specifically are spending a lot of time thinking,
talking amongst each other and coming up with a better output or a better answer.
That's what a lot of cutting-edge models and AI products today actually do behind the scenes.
That requires a lot of inference.
Now, if you want to do all that amount of inference, you need not just a little bit more memory,
you need a load more memory.
So the argument that's being made right now is in a future where the entire economy is
facilitated or propped up by AI agents, you're going to need like 10 to 50x more memory than
we have available today.
And Ben from Stratechery did an amazing breakdown,
and I kind of summarized it here, where basically the value of the opportunity is like 10 to 50x from now,
and we don't currently have enough memory to do that.
And it immediately reminded me of another company,
which was Nvidia, back when they were building their GPUs
for gaming processing, graphics processing specifically,
and then they stumbled across this AI thing
where they were like, huh, it also needs parallel computing.
And now, like, the demand sink from AI was crazy.
The same thing is happening with memory,
and these memory manufacturers are going to benefit the most.
So we're going to get into what companies are best positioned for memory.
But prior to that,
it probably makes sense to explain why this time is different, why the clock is ticking,
where there's a perhaps a limited opportunity before the market starts to realize this
structural change. Now, we mentioned the problems that are occurring in terms of just supply
side due to the invention of HBM and just the requirement of that in every single GPU,
but there's also a software side demand vector as well, which is the rise of agents. And we've
mentioned this in a previous episode, but I think it's important to double down on, is the idea
that you're not just talking to a singular chatbot anymore. When you send a query to an
LLM like ChatGPT or Claude, you create this thing called the KV cache, and that uses a lot of
memory. It stores a lot of the context inside of the model's context window. Now, what happens with agents,
which is the new paradigm, which is what a lot of these companies are shifting to and what a lot of people
who are doing productive work are actually using, is that each agent consumes its own KV
cache, which means if you have 20 agents, you need 20 instances of that memory. And there is no sign of
that trend slowing down: more instances and more demand for agents means
more context that needs to be remembered, which means significantly more of that HBM, which is
already supply-constrained. In fact, if you want to get some memory now, lead times are extended
currently to the end of 2027 because there's just this inability to scale production. And I think
when we answer the question of why this time is different, there are these unique structural changes,
both on the hardware supply side, but also on the software demand side, that both
point to a pretty far-out window before you should start to worry about demand, because
there is just a very clear trend in one direction and a very clear under indexing of the ability
to supply all of the memory required to fulfill that trend. So I think that's something worth
noting is that there are demands for this coming from all these different areas.
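The agent math above can be made concrete with a back-of-the-envelope KV-cache estimate. Every dimension below (layer count, head count, head size, context length) is a hypothetical round number, not any real model's configuration:

```python
# Back-of-the-envelope KV-cache sizing, to illustrate why every extra
# agent multiplies memory use. All model dimensions are made-up round
# numbers, not any specific model's real configuration.

def kv_cache_bytes(layers, kv_heads, head_dim, context_tokens, bytes_per_value=2):
    # Each token stores one key vector and one value vector per layer
    # (hence the leading 2x), typically in 16-bit precision (2 bytes).
    return 2 * layers * kv_heads * head_dim * context_tokens * bytes_per_value

one_agent = kv_cache_bytes(layers=64, kv_heads=8, head_dim=128, context_tokens=100_000)
fleet = 20 * one_agent  # 20 agents means 20 independent caches

print(f"one agent: {one_agent / 1e9:.1f} GB, 20 agents: {fleet / 1e9:.1f} GB")
# one agent: 26.2 GB, 20 agents: 524.3 GB
```

The scaling is linear: doubling the agents or doubling their context doubles the cache footprint, which is the structural reason agentic workloads hit memory before they hit compute.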
There's a physical constraint to building memory. The top three providers cannot physically
build out more capacity. In fact, I was reading a story earlier
this week, where apparently a bunch of different AI customers have been offering Micron and
the likes of Samsung and SK Hynix to help them build out or purchase equipment in exchange
for guaranteed capacity or supply in 2027 and 2028, right? You're not exaggerating that
currently they're booked up until the end of 2027. And so there's a physical constraint there,
typically in a bubble, whether it's a memory bubble or a tech bubble in general,
You have some version of these companies faking demand,
and they're levering up to create a supply which ends up becoming an oversupply.
So there's too much of the product out there,
and so prices kind of crash and demand kind of sinks.
That isn't the case here.
If you look at every quarterly earnings report from each of these three manufacturers
who are all public, by the way, you can see that these back orders,
that these bookings are not only legitimate, but they're already paid for in advance,
which typically doesn't happen with any of these contracts.
So you have legitimate cash on the books that you can then use as a cash projection or a forward projection for a lot of these companies.
In fact, if you look at Micron, their forward earnings multiple right now is under 10, which isn't what you'd see in a bubble,
where in past historical cycles that typically hits somewhere between 30 and 50 on a forward earnings ratio.
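For listeners who want the mechanics: forward P/E is just today's share price divided by the next twelve months' *expected* earnings per share. The numbers below are invented purely to show the arithmetic, not real quotes for any company discussed here:

```python
# Forward P/E mechanics. The price and EPS figures are made up for
# illustration only, not real quotes.

def forward_pe(price: float, expected_eps: float) -> float:
    return price / expected_eps

# A stock can be up huge and still screen "cheap" if expected earnings
# grew even faster than the price did.
print(forward_pe(price=250.0, expected_eps=25.0))          # 10.0  -> the sub-10 zone cited
print(f"{forward_pe(price=250.0, expected_eps=6.0):.1f}")  # 41.7  -> bubble-era 30-50x territory
```

This is why the hosts keep leaning on the metric: a low forward multiple during a price run-up means the run-up is being matched by booked earnings, not just sentiment.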
So the point is, it's legitimate as far as we can say.
Now, skeptics are saying, hmm, maybe
it's the earnings that are the bubble, that the earnings they're reporting aren't actually real.
But that's harder to pick apart than what we're seeing right now. Yeah, I was actually going to mention,
I think that's such an important point. The idea that although these companies are trading so much
higher, SanDisk is up 1,500 percent, the forward PE ratio, which is the price to earnings,
basically what premium it trades for on a relative basis versus what it's worth today, is only 9 to 12x.
And it's lower for Samsung at 7x, and it's even lower for SK Hynix at 5 to 6x,
so they're not really trading at a premium relative to what they're guiding for their estimates
and revenue to be. If you take a company like Tesla, I think that's currently trading at 40 to 50x,
and companies like Microsoft or Amazon are like 15 to 20x. So these memory stocks on a relative
basis are actually valued pretty much in line with what you'd expect for an industry that's doing
very well. And that's noteworthy. So let's get into the actual stocks, the companies that
people can participate in, or at least look at if they're interested in participating in this.
There are three large players.
There's SK Hynix, there's Samsung, and there's SanDisk.
I'm seeing another ticker here, which perhaps you could walk us through, which is MU, up
864%.
Why is that up so much?
Yeah, I mean, it's Micron.
Micron is up over 8X over the last year.
The primary reason for that is the other two companies, the biggest memory manufacturers,
which are SK Hynix and Samsung Electronics, are based in Korea.
So if you're in the West or if you're in America specifically and you want to get exposure
to these stocks, it's not as easy as going on to your Chase stock account and purchasing
these. You can't exactly do that. There's a few more hoops to jump through, and most people
aren't willing to do that. So another way of getting exposure to this thing is to buy the main
American memory manufacturer, and there is one clear leader, which is Micron. So their stock
has absolutely soared. And to be fair, in Micron's defense, they produce a really good product,
and they are one of the advanced suppliers. But if you want to talk about who the king is, it is
SK Hynix. They have been the sole and main provider of memory for
Nvidia and all their future GPUs.
In fact, in the same way that Nvidia has booked out the majority of capacity at TSMC to
build their GPUs, they have done the same with SK Hynix.
So the reason why those people that I mentioned earlier are trying to pay off SK Hynix
for guaranteed supply in the future is because Nvidia owns the monopoly on all
capacity at SK Hynix going forward.
So SK Hynix is like the strongest.
Samsung Electronics is next.
Samsung Electronics does a ton of other stuff, which is why it's not a pure concentrated bet on memory.
And then you have Micron, which is like the American bet.
Now, what I want to speak about is that ETF I mentioned earlier, Josh, that launched six weeks ago.
That is like the fastest growing ETF, I think it's like $5 billion worth of inflows over the last five weeks or $6 billion.
It's this thing called DRAM.
It's a fantastic ticker.
And it's a memory ETF.
And the reason why it's so unique is two reasons.
one, it's accessible in the West, but two, it's a basket exposure to those two Korean companies
that I mentioned earlier on. It owns SK Heinex shares. It owns Samsung Electronics shares,
but it also owns Micron. In fact, all three of these companies, I think, make up around
70% of the entire basket. So it's a very concentrated play on memory, AI memory specifically.
And then it also owns smaller stakes in the likes of SanDisk, Western Digital, and Seagate, which make a different
type of memory that is still equally important for the AI trade.
We have a great visual on this.
I think this is probably what most people will be interested in participating in, just showing
the holdings within this.
I mean, this is the access that you want, if you want access to the Korean companies.
It has 24% SK Hynix, 25% Micron Technology, 5% in WD, 6% Seagate, and Samsung is 20%.
So this is probably the most balanced basket of exposure.
This is personally what I'm most interested in, if you don't want to think about it too
much, you just set it and forget it. Ejaaz, you mentioned this to me a little while ago,
and since then it has gone absolutely nuclear. This seems to be the singular place in which people
are converging on if they don't want to do all the hard work of picking winners and losers. In fact,
this has recently gotten enough attention to where they have leveled it up even once more.
So now there is a ticker that goes by the name of just RAM instead of DRAM, which is the
ETF that covers basically the entire memory market. And this is a 2X leverage long memory ETF.
That is going to be launching very soon.
So if you really want to go risk on, if you really believe in the memory trade over a long period of time,
there are tools available to be able to enable you to gamble on this to the full extent.
I'll probably just stick with the regular DRAM ETF.
But for those more ambitious, this is probably the best way to get access to those individual names,
particularly if you're listening in the United States like most of us are,
and you want access to those Korean powerhouses.
This is a pretty good way of doing it.
Now, I'm sure what a lot of listeners are thinking as they're
hearing what we're saying is, okay, well, all these stocks are up five to eight X. We've completely missed
the trade. Why on earth would we potentially want to invest in this? Is this a bubble? Is this going to
pop? And, you know, it's feasible to argue that that might be the case, that this time isn't
actually different. In fact, if we look at the historical context of these memory suppliers,
On the left here in 2000, there were 11 main manufacturers of memory.
And it's important to understand that memory has gone through boom and bust cycles for the better
part of the last two to three decades.
And with every boom and bust cycle, you end up with fewer players, which brings us over to
the right over here, which, this is funny.
This was in 2013, Josh.
So these three have kind of stuck it out.
You have three companies which basically dominate memory manufacturing in its entirety.
So the question now is, is this going to happen in the same way?
Now, there's two sides of this argument.
The side that argues against this thesis is that AI has created a demand sink for the reasons that we mentioned earlier: agentic AI,
the fact that you need it for all successive GPUs in larger quantities than we've seen in any other technology shift.
And that is the case.
AI is pretty pervasive across every single industry.
So if you assume that every single industry is going to have some form of AI
model that changes the way that the industry works, or AI agents that can automate, say,
I don't know, 10 to 40 percent of a worker's role, you can then extrapolate to think,
well, okay, that's going to need a bunch of memory, a bunch more GPUs, a bunch more CPUs, a bunch more
agents to be able to facilitate that. So you can then kind of mark up the amount of memory
that is required. And that's why these stocks have probably a longer way to go. Now, the skeptic would
argue, okay, that's the case for now. But what if we launch an AI model or a GPU, which requires
much less memory, won't that mean that memory oversupply will happen?
What happens when Micron's fab gets launched and we have a heck of a ton more memory?
Will that decrease prices in demand?
You could argue yes and no.
Jevons paradox: people will maybe just want more of these things.
So you may need even more memory.
But those are both sides of the opposing argument.
Okay, so I guess I have something that I'd want to challenge you on in kind of supporting
why this time is different if you think this time is truly different.
Because when we look at the historical patterns of what these
memory cycles are like, they generally last about 18 months and then experience a pullback of
anywhere from like 40 to 75%. This happened in 1995, 99, 2017, 2023, and now this current cycle
started actually in the fourth quarter of 2024. So we're about month 18 now, which is where
things should be either taking a turn for the worse or changing for the better. So if you're thinking
about how you allocate personally and how you believe this to go, this is a pretty pivotal time. If
you're looking at it as if there's like resistance on the chart, we're at that resistance right now.
We're in month 18. Does it seem like it's more probable that this actually is different?
Or is this the natural cycle? Because I mean, all the signs point to it being different this time.
But you mentioned, I mean, the ETF had doubled in five weeks. So a lot of that like easy beta that people
had is gone. Is there still opportunity for the upside and what time scale are you thinking about?
If you ask me personally, I do think that there is a long term a lot more upside. But it wouldn't
surprise me if we see a short-term retracement in stock price charts. Like, we have had,
I think, six weeks of constant uptick in this DRAM ETF and in just memory stocks in general.
And that has been on the basis of these quarterly earnings and a bunch of other partnership announcements.
That can't keep happening every single week. At some point, we're going to run out of news
to kind of push prices up further. And I think we're going to see a retracement. I think people
are kind of overthinking it. Now, the argument against that would be, well, forward earnings
are still quite low. And, you know, some people might see it that way, but I just don't think we can see a chart go up exponentially every single week. I think we're going to have some pullback.
The thing that gives me optimism is that backlog of orders I mentioned earlier, which aren't just signed and sealed; before they're even delivered, people have paid for them up front.
There was a recent deal that was struck between, I believe it was Google and Micron for their TPUs. And they paid
40% up front. So the point is, like, these companies really want memory, and the companies
that really want memory are not unknown companies. They're the Mag 7 who have the cash flow and
free cash flow to pay for it and back it up. So if you have some of the smartest, most intelligent
companies, most valuable companies in the world, putting up money and big risks like that,
you could argue that this time is different and more legitimate. Now, of course, there is the non-zero
chance that all of this blows up, that the demand is actually fake, that Anthropic and OpenAI on
the demand side actually end up inflating their numbers too much, and that could lead to a waterfall
cascade of all these things crashing. But I don't see the structural context for that happening
right now. 12 months from now, higher or lower? Higher. I think higher too. Yeah. That feels like the vibe.
There is like very clear trajectory and sure there will be a bit of volatility along the way.
But higher feels like an optimistic version of the future that we have enough
information to go on that I can feel comfortable about betting on. So I think that's probably
most of the case for memory. That's why it's important to note what unique market structures are now
available that weren't there in past versions of these 18-month cycles, and where the opportunity
lies if you were interested in participating. I guess the ask for you, the person who's listening to
this, is what do you think on that 12-month time scale or what is the best way for you to get exposure?
There's this chart that really devastates me personally, Ejaaz.
As someone who loves to create my own custom PCs and build hardware, the cost, yeah,
if you scroll down all the way to the bottom even, the cost of these memories or memory systems
for consumer products is through the roof.
And that's been a little disappointing because if you want to just go buy a PC,
if you want to get a new Xbox, even the new steam machines, they've all been delayed or
the prices have been increased.
In fact, if you want to go buy a PlayStation 5, it costs more today
than it did five years ago when it first came out, because of these supply constraints.
So we're starting to see it permeate out into the general broad consumer market.
And that feels a little discouraging.
I wonder how high that can go.
I wonder how much tolerance the consumer market has before it starts to bend.
So that's something I'm going to be looking at also is following these prices as it relates
to just general consumer hardware.
Is the iPhone going to suffer?
Are those prices going to go up?
The average smartphone cost.
Things of this nature are also on the watch list.
So we'll see if consumers can bear it.
And if they can, then we're probably good.
If not, it's something to just take some note of.
Personally, memory is going to be one of the biggest bets in the same way that GPUs were.
And that trade saw Nvidia breach a $5 trillion market cap.
I think the same thing is going to happen in memory this time.
And we have three concentrated players, one in America, Micron, that is going to do the same thing for memory.
So I'm excited about it.
I am slightly nervous about the boom and bust of these different cycles.
but I guess we'll see if this time is different.
It's getting high, man.
Yeah.
Things are going up.
Well, at some point it's going to break, right?
Like, I'm looking at this like consumer PC memory chart and I'm like, at some point
people are just going to throw their hands up in the air.
Like, it's going to be too expensive and something will need to shift, whether it is
these consumer device makers like Apple pushing back and saying, listen, we're just not going
to do this or we need to find an alternative.
I don't know what that is.
but one thing is for sure is the demand for AI models.
The demand for AI agents is just going up and to the right.
If it's not coming from consumer, it's coming from Enterprise.
So I just don't know how this conundrum is going to be fixed.
Maybe we create an alternative type of model,
but either way, we will be tracking it,
and I'm bullish memory at least for the next six to 12 months.
But that is it.
I'm curious whether the listeners of the show have any other takes.
We've released, I think, two other investment-themed episodes
over the last week and a half,
and the feedback has been tremendous.
Both comments telling us why we're right
and why we are explicitly wrong.
I want to hear from both sides
because it helps us kind of like figure out
where we're going next
and what to cover next.
Does anything that we missed
on the memory set of things
also let us know?
And finally, we heard you
on wanting to talk about
or unpack the layers of substrates
and infrastructure that we covered
on our previous episode.
We are going to plan
and deliver that episode fairly soon.
But this week, later on, we're getting Leopold's filing.
Happy Leopold week.
Happy Leopold week.
It's huge.
Our last two episodes were among the biggest ever because they were covering Leopold.
I don't want to say, I don't want to take any credit.
But if you search up Leopold's portfolio or if you go on X, we're very much the beginning
of the Leopold narrative.
And now it is the next chapter in that.
So Leopold's fund, Situational Awareness, has gone from $1.5 billion to like $6 billion over the
course of two years.
And they publish these things called 13F filings every quarter,
which reveal their new set of positions.
That 13F filing is going live sometime by the end of this week.
It's due by Friday.
So we will be very ambitiously monitoring this.
And as soon as it releases, we'll have an episode ready to go
to talk about all of the new investment theses
around situational awareness for the next quarter.
So stay tuned for that.
We are locked in and ready to go as soon as that one drops.
Yep, exactly.
But aside from that, I think that is it.
And we will leave you guys until the next episode.
But see you then.
