Limitless Podcast - The Semiconductor Squeeze is Quickly Becoming The Biggest AI Crisis
Episode Date: January 6, 2026

There is an urgent memory crisis in the AI industry as 2026 begins, highlighted by graphics card prices skyrocketing from $500 to $4,000. We reveal the $100 billion memory supply gap and its critical implications for AI performance and consumer electronics. Key players like NVIDIA, Google, and major memory manufacturers struggle to meet demand, while Apple's upcoming A20 chip raises questions about pricing strategies. Join us for insights on how consolidation among DRAM suppliers and fierce competition shape the tech landscape, and the potential effects on consumers moving forward!

🌌 LIMITLESS HQ: LISTEN & FOLLOW HERE ⬇️
https://limitless.bankless.com/
https://x.com/LimitlessFT

TIMESTAMPS
0:00 The AI Memory Crisis
3:12 The Rise of Memory Prices
6:21 Apple vs. NVIDIA: The Memory Battle
9:17 The Memory Manufacturing Giants
13:22 The Shift in Market Dynamics
14:37 The Future of Memory in 2026
18:05 The New Gold Rush for AI
18:59 Your Stories: Impact of Rising Costs

RESOURCES
Josh: https://x.com/JoshKale
Ejaaz: https://x.com/cryptopunk7213

Not financial or tax advice. See our investment disclosures here:
https://www.bankless.com/disclosures
Transcript
We're not even a week into 2026, and AI's biggest crisis is already underway.
The battle for memory.
What if I told you this ugly piece of plastic, a graphics card, is now worth $4,000 when
it was only worth $500 literally two months ago?
The reason, AI is consuming the entire memory chip supply, and it's leaving behind a $100 billion
hole that needs to be filled ASAP.
You see, memory is a really crucial component for AI models.
It helps them store and access intelligence, very much like how
our human brains work. In fact, it's so important that companies like Nvidia and Google spend
tens of billions of dollars each year just on memory to build their GPUs. The issue, though,
is that manufacturers of memory like Samsung and Micron are running out of supply, which means that
prices for these things are skyrocketing. In fact, it's so bad that companies like Google
and Microsoft have each fired their executives that live out in Southeast Asia because they
weren't able to secure capacity for 2026.
Everyone thought that GPUs were the main commodity to win the AI race,
but it turns out it's actually memory.
Josh, you're kind of on the front line for this kind of stuff.
You've had experience, like, building out custom PCs and stuff,
and you've been tracking the memory prices for a while.
Now, what's your take on this?
This story for me hits very close to home because I've spent so much time building
custom computers, custom PCs.
I love gaming.
That was my thing.
And seeing this is really disappointing because throughout history, there have always been these reasons why my GPUs have gotten so expensive.
First, it was for like crypto mining.
Now it's for AI.
And all the supply chain crises that land downstream on PC gamers are such an annoying thing to deal with.
There's this cool chart that I'm going to show on the screen right here, which shows DDR5 RAM prices over time.
And I guess what I could do is kind of highlight why you need RAM in a computer.
There's a few parts.
There's a case that holds it.
There's the power supply.
There's the GPU and the CPU, which you can think of as
the brains, and then there's memory. And there are two types of memory. There's one that you could
think of as short term, which is RAM, and there's one long term, which is your storage. What we're looking at
here is the price of RAM. RAM is the short-term fast memory that's required to do a lot of heavy
compute things. So if you think of your computer as a kitchen, maybe it's like the GPU and the CPU
is the chef. The RAM is the countertop space where the ingredients are laid out, and then your storage
is in the pantry somewhere else. The problem is that this countertop space where you quickly work on
is very valuable. And a lot of people really want that precious countertop space. And what we're
seeing here is the price of a 232 gigabyte sticks of RAM, which is very standard for a computer,
going from $200 to $800 over the last couple of months. And there's this great example of this
post from this guy Levels, who we love following on Twitter. He was talking about his experience,
where he actually bought two sticks of RAM for 64 gigabytes a few months ago at $350. And now they are
$2,500 for that same exact thing. The market is incredible. So if you are buying a PC,
building a PC, or if you're buying any sort of consumer hardware, the idea is that these costs are
going to have to find their way into the market somehow, and they're probably going to be hitting
you in the wallet right where it hurts. So I want to pause for a second and kind of rewind five
months ago to when the rumblings of all of this began. Because to be frank, this kind of took me by
surprise. I didn't realize that memory was such a crucial component, more so that there was a supply
crunch for it. So around five months ago, we were kind of like winding down the year. You know,
OpenAI probably announced their thousandth partnership and Nvidia launched their next GPU or whatever.
And we were reaching a point where some analysts on Wall Street started to sound the alarm on prices
of memory going up. But it didn't appear in GPUs or in any part of the AI world, Josh. It started
appearing in consumer electronics or the cost of these graphic cards that you were just talking
about. And so they started saying, well, this is going to eventually trickle down into GPUs because
they require a lot of memory. So therefore, GPU costs are probably going to go sky-high.
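As a rough back-of-the-envelope, here's a quick Python sketch of how a spike in memory prices propagates into a GPU's total material cost. The ~80% memory share, the $1,000 baseline, and the function name are illustrative assumptions, not figures pulled from a manufacturer:

```python
# Rough sketch: how a memory price spike inflates a GPU's bill of materials (BOM).
# The 80% memory share and the $1,000 baseline are illustrative assumptions.

def inflated_bom(base_bom: float, memory_share: float, memory_multiplier: float) -> float:
    """Return the new BOM after memory prices are multiplied by `memory_multiplier`."""
    memory_cost = base_bom * memory_share
    other_cost = base_bom * (1 - memory_share)
    return other_cost + memory_cost * memory_multiplier

base = 1_000.0  # hypothetical baseline material cost
new_bom = inflated_bom(base, memory_share=0.8, memory_multiplier=2.0)
print(round(new_bom, 2))                # 1800.0
print(round((new_bom / base - 1) * 100, 2))  # 80.0 -> a doubling of memory alone lifts total cost 80%
```

The point of the toy numbers: when one input dominates the bill of materials, its price swings pass through to the whole product almost one-for-one.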
So I started looking into this and this tweet really summarizes this well. Did you know that
80% of the average material cost to build a GPU is memory? That is just an insane amount of
materials going into a single component.
So when you're looking at the prices of these consumer electronics going up, such as we have on the screen here, it starts to become clear that it's not just an AI-specific thing. This memory is required for pretty much any consumer electronics device that you have available. Yeah, and as you mentioned earlier, I was tracking the prices of GPUs. I had one that I saved for $2,000, and now the price is upwards of $4,000, so I have to update my reference because the prices have gone up so quickly. And I think I want to make this important clarification that it's not just RAM.
It's a specific kind of memory that AI needs.
So when people say RAM shortage, they mean the sticks you buy for your PC, which are DDR4,
DDR5, but the AI world has its own special type of fuel, which is different than the things
you plug into your computer.
And that's called HBM.
It's high bandwidth memory.
And you could kind of think of it.
It's like the Formula One pit crew of memory.
It is the fastest thing that exists because it takes two-dimensional RAM, which is generally
reserved for DDR4, DDR5 things you put in your computer.
And it stacks them together in three dimensions to add a lot of bandwidth,
with a lot of capacity, a lot of additional speed
that you wouldn't otherwise find
in these traditional pieces of RAM.
And the idea is that there's a lot of downstream effects
on consumer products that come from this specialized
HBM being absorbed by all these manufacturers.
So basically, each HBM is composed of a stack of DRAM,
dynamic random access memory, as you said.
And it requires a lot of this, Josh.
In fact, to create one unit of HBM,
it requires three times the capacity
that it requires to build regular DRAM.
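That capacity math can be sketched in a few lines of Python. The 3x factor is the one just cited in the conversation; the function name and the 100 GB volume are purely illustrative assumptions:

```python
# Illustrative sketch: HBM is built by stacking DRAM dies, so each gigabyte of
# HBM output consumes a multiple of the wafer capacity that plain DRAM would.
# The 3x factor comes from the discussion above; the volumes are made up.

HBM_CAPACITY_FACTOR = 3  # wafer capacity per GB of HBM vs. per GB of plain DRAM

def dram_equivalent_capacity(hbm_gb: float) -> float:
    """Wafer capacity, in plain-DRAM gigabytes, consumed to produce `hbm_gb` of HBM."""
    return hbm_gb * HBM_CAPACITY_FACTOR

# If a fab shifts 100 GB worth of output to HBM, it gives up roughly 300 GB of
# plain DRAM it could otherwise have shipped into phones, PCs, and servers.
print(dram_equivalent_capacity(100))  # 300
```

This is why HBM demand squeezes the consumer market: every unit of AI memory displaces several units of the commodity DRAM that everything else depends on.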
So why am I talking about these two things?
Well, DRAM is what you need in pretty much every single electronics device,
including, drum roll, the Apple iPhone.
So if you start to think about it,
these companies need to start competing for the very supply
that is limited and that we're talking about right now.
And so when I think about the likes of Nvidia,
Jensen Huang, who needs all this memory and DRAM to build out his GPUs,
and then Apple, surely there's going to be some kind of price hike
that levels up into the consumer.
And that's what we're hearing on the rumor mill here.
So Apple is releasing their new A20 chip this year, which is the upgrade from the A19 chip.
And the rumors say that it's going to cost 80% more than the A19 chip.
Now, I don't see a world where Apple doesn't pass this cost down onto the consumer,
because there's now massive competition between basically Nvidia and Apple as to who pays more money to secure the capacity.
Now, Apple isn't someone that has a small wallet.
They have a very large budget.
They're able to secure this supply.
In fact, they accounted for 24% of TSMC's revenue in 2024.
And rumor has it that they've secured 50% of TSMC's memory packaging capacity in 2026.
So I think they're still able to compete, but for how long I'm not entirely sure.
There's this great example that we're showing on the screen where it shows a 64 gigabyte memory package compared to a MacBook Air.
And the funny thing is, the prices are pretty much exactly the same.
In fact, the MacBook Air only costs $15 more than the sticks of memory.
So with Apple, you can essentially buy the RAM and get the MacBook for free because it is all packaged at the same exact price.
And it shows that Apple does have this resistance to price impacts felt throughout the market.
How long will this last?
I don't know.
I have to assume that Apple, like you said, they're very well capitalized.
They have the ability to shrink their margins temporarily in order to gain more market share across the world.
And if this is the case and if people are either looking to buy a PC,
or to buy a MacBook, and the MacBook is the cost of a single component of the PC.
It's a very strong and compelling argument to buy Apple products.
So how much of this increase is going to be felt throughout the products?
I don't know.
I guess the main thing we'll probably see is later this year with new iPhones,
but they also have some series of Macs that are coming out early this year,
and I guess we'll just have to wait and see if they're able to maintain this,
to hold the line, because what a great deal now.
If you're buying a computer, go buy a MacBook.
It's the same price as a single component of a PC.
I want to push back on one thing, Josh. These are the MacBooks that exist today, right? The models are already out. The supply is already out. But wouldn't these price changes be seen in the future products that they release? Like the next MacBook that they drop, the next iPhone that they drop. I think that's where we're going to see the price hikes. Am I missing something here? No, it's possible. We'll see. The M5 MacBooks, or probably the newer ones, are coming out in the next few months; they're rumored to come out in Q1 of 2026. So,
we will have to wait and see.
Historically, Apple has been pretty good at resisting these fluctuations and smoothing them out over a long period of time.
They might eat the cost.
Perhaps it's incremental.
Perhaps they eat the cost in exchange for getting larger market share.
We'll just have to wait and see.
But they can certainly afford it whatever issue may come their way.
What's interesting is Jensen is not eating the cost.
He's passing that memory price hike straight down to the consumer.
So I think the average cost of his GPU was like 35K.
He's now selling them for 45K,
with hefty margins.
Hefty, hefty, 80% margin.
Don't ever forget that.
Nvidia is a huge monopoly.
But speaking of monopolies, actually,
I think now's the perfect time
to introduce the key players
in the memory manufacturing game.
Now, let's call these the three musketeers.
These guys have massive grins on their face
for many different reasons.
Number one, they are the primary
and only providers of high-bandwidth memory
and DRAM, which is what both Apple and Nvidia need to build their respective products.
And Josh, what I'm showing on the screen right now is the timeline of memory manufacturers
I think roughly over the last 25 years or so.
And you'll notice that when you pan from left to right, for those of you who are just listening,
you go from about 11 players in 2000 to three players in 2013.
And these three players, Samsung, SK Hynix, and Micron, are the major players that we're going to be talking about today.
They've had a fantastic 2025, Josh.
Do you want to guess what the average price increase in their share prices has been over the last year?
Looking at this chart, we lost 70% of the players over the last 20 years,
which means the remaining 30% are capturing 100% of this unbelievable demand in one of the highest-priced industries in
the world, which means surely they're up a good amount. How much? I don't know, but I would guess
perhaps Robinhood or Palantir levels, or above. You might be right. So I'm showing Micron
Technology, which is the U.S. ambassador of memory supply right here. We're all go-USA over here.
Their stock is up 250% over the last year. If we peek over at Samsung electronics, which is the largest
market cap already out of all three of these companies, their stock is up a massive 150% over the
last year. And the same is true for SK Hynix, which is a South Korean-based company.
The point I'm making is these guys are in the perfect position because they're the ones that are
able to hike the prices up and say, hey, sorry, Nvidia, you've got to pay 80% more,
either take it or leave it or I'm going to sell it to Apple. And Jensen's like, fine. I'll spend
that money. I'm fine with that. And Apple's doing the same as well. So not only is all their supply
booked up, but their supply for the next couple of years, from 2027 to the end of the 2020s, is also booked up.
So then the question becomes, which one of these three players is going to fill the $100 billion supply hole that is currently there?
It's going to be a race between the three of them.
Whoever can fill it will be the richest and will be the kingmaker of this entire memory manufacturer race.
My bet, Josh, is it's going to be Samsung.
I have two specific reasons for this.
Number one, Samsung has been the biggest memory provider for over a decade now.
And they've been able to navigate this market through memory cycles up and down for decades now.
They have all the experience and funding to be able to do so.
Which brings me to my second point: they're known as what is called a chaebol in Korea,
which is basically a giant family-run conglomerate.
They're able to pull funds from all of their other cash-making sectors of their business,
their electronics business, the mobile phone business, to keep the memory business alive.
And even if it turns into a zero-margin type of race,
they'll still be able to win and survive.
And the truth is, whoever survives the memory crunch that they're currently in will end
up being the winner.
So I think Samsung's got a lot of legs here.
So that's the case for Samsung, but we also have a bold case for Micron, which actually
exited the consumer business entirely.
So one of these three major manufacturers left the consumer business, meaning, if you used
to buy Crucial RAM, which is actually the memory that's in my computer, they no longer
exist. They said, see you later. We're going for the big boys. We want the big bucks with these
AI companies. And it's devastating for the consumers because this is where you're really seeing
the price increase. One of these three major players is now gone. They're just catering to the
big players in the market, but it also means that they are focused on bringing rates down.
So if you are in the market for a custom PC or any sort of consumer hardware, apologies
are owed to you. But also, the hope is that them doubling down on this will mean that they can
produce a lot more and hopefully lower the downstream cost to these AI providers, which will then
lower the cost down to you, the end consumer. But they're locked in. If you ever wanted any
indication that they're coming for Samsung's neck, this is it. They left the consumer market. They're
all in on AI. And hey, give them a lot of credit. We'll see what happens. I think what amuses me the
most is just the raw power that these companies wield. There was a news story that broke last week
that SK Hynix, the third player in this memory game, told Microsoft no to their extra requests
for capacity in 2026 to build their own chips and supply OpenAI in many different ways.
They also told Google no the week before as well, which led to reports of that Google exec
being fired. So the point of the matter is these three companies are
going to control the spice, for all the Dune fans out there, of whether you
can build GPUs or whether you can build iPhones at a reasonable cost without passing that on to the
consumer. But it's equally on them to be able to scale supply to be able to meet demand. And that is
going to be a really important battle to track in 2026, which is probably a good point to transition on
to the kind of future facing section here, Josh, which is like, how does this play out in 2026 and
what are kind of like the key themes that we're going to see? I think the main one is going to be,
well, there's going to be a lot of fights between all the AI labs
and Nvidia to get their hands on memory capacity.
And the companies that are able to do this and navigate this well
will end up being the winning AI companies in 2026 potentially.
Whatever it may be, OpenAI has taken the first punch,
or rather they've delivered the first punch,
reportedly locking in 40% of global DRAM wafer capacity supply
through 2029.
I don't know how true this is,
but I have a feeling this is linked to all the partnerships
that they were signing with Oracle and Nvidia and stuff like that. So, you know, it might be a bit
kind of fluffy, but interesting to see. Yeah, we find ourselves in this interesting situation
when we're always on the lookout for these bottlenecks. Where are there going to be problems
as we scale these systems? And right now, the largest and most important one, outside of energy
is RAM. But now we have two. Now we have this memory issue, and we have this energy issue.
We are kind of accumulating these problems along the way, to the point where now RAM is
becoming close to worth its weight in gold. If gold hadn't just gone up like 30%, it would have
been. And just today, it was announced, we have this post on screen, saying that the prices for
AWS, for Microsoft, for Google Cloud, the RAM prices are 70% higher than in the fourth quarter of
last year. So the ramp-up is incredibly high. And it's showing, I guess, how durable the industry
is, where they're ready to just absorb this and keep going. That's how important this progress is.
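For a sense of scale, here's a quick sketch of what a 70% quarter-over-quarter increase would compound to if it were sustained; this is a purely hypothetical extrapolation of the quoted figure, not a forecast:

```python
# Hypothetical compounding sketch: cloud RAM prices are quoted as up 70%
# versus the previous quarter. If (purely hypothetically) that rate held,
# the starting price would compound as below.

def compound(quarterly_increase: float, quarters: int) -> float:
    """Multiplier on the starting price after `quarters` of the given increase."""
    return (1 + quarterly_increase) ** quarters

print(round(compound(0.70, 1), 2))  # 1.7  -> one quarter: +70%
print(round(compound(0.70, 4), 2))  # 8.35 -> a full year at that rate: ~8.4x
```

Nobody expects four straight quarters at that rate, which is exactly why a single 70% jump is treated as a crisis signal rather than a trend line.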
So things are getting a lot more expensive.
And there is a new gold rush on the block.
And that is for random access memory with high bandwidth,
so they can deliver the tokens to you.
And finally, everyone knows the story about
Nvidia spending $20 billion to acquire the licensing rights
of this company called Groq,
which also makes chips for AI models.
But the story that everyone missed
was the fact that these chips are made
with a very specific type of memory called SRAM,
static random access memory,
which is a different type of memory to DRAM.
So you can imagine that in a world where DRAM prices are skyrocketing
and everyone's dependent on DRAM,
having a chip that's made of a different type of memory
that costs a fraction of the price of the competing memory type
is probably a good thing.
And Nvidia bought what is pretty much the only, very expensive, $20 billion
get-out-of-jail-free card that was available.
So for Nvidia, even if DRAM prices continue to increase to an exorbitant amount or a crazy
amount, they have this way out to basically still support scaling of their GPUs without
hiking the costs too much.
Just a masterful chess play from the monopoly.
Yeah, Jensen is, he's on fire, man.
Every decision that he makes seems so calculated; he seems so aware of
where the puck is headed, and acquiring Groq and getting themselves all of this power on the
inferencing front is such a huge deal because now there is no real threat. They've absorbed the
threat and they've made it their own advantage. So if I had to summarize this, the price of things
are going up. Why? Because memory is in short supply. And not the kind that you plug into your
computer as a hard drive to put your photos on, but the kind that allows your computer to think
remarkably quickly. And the fastest version of this, this high bandwidth memory, has become the
new gold rush for AI companies across the world who want to generate tokens as fast and efficiently
as everyone else. It is a new bottleneck that we need to monitor because there is now an
increasing number of things that can go wrong. So we'll be keeping a close eye on this, how resilient
these companies are to that 70% price increase over the quarter and how the consumer market's
going to act. As for me, being a gamer, someone who uses a PC, this sucks. Things are a lot more
expensive now, two to three thousand dollars more expensive per computer, but we'll just evaluate
the situation and see where we stand. If you have built a computer in the past, or if you are
affected by this, or if you think the price of an iPhone is going to go up, tell us how much,
tell us your stories. I'm so curious to hear the firsthand accounts of how people are impacted
by these things outside of the general scope that we talk about here on the show. So please
share that in the comment section. Share this episode with your friends if you enjoyed it.
And don't forget to like and subscribe wherever you are getting your podcast here.
The cost of this show has officially gone up 500% in the last two months because of the memory required to run this entire show.
So guys, if you want us to still be alive and pump out three to four episodes a week, please like, please subscribe.
We've got awesome news: we're dropping essays and highlights twice a week now.
Best source of information you can get.
And that's it.
So with all that said, thank you very much for watching.
and we'll see you guys in the next one.
See you guys.
