The Journal. - The Tech CEO Leading Nvidia's Main Rival
Episode Date: December 9, 2025

Earlier this year, OpenAI and chip designer Advanced Micro Devices, or AMD, announced a multibillion-dollar partnership to collaborate on AI data centers that will run on AMD processors, one of the most direct challenges yet to industry leader Nvidia. WSJ's Robbie Whelan spoke to AMD CEO Lisa Su about the deal, her company, and the prospect of an AI bubble. Ryan Knutson hosts.

Further Listening:
- CoreWeave, the Company Riding the AI Boom
- Is the AI Boom… a Bubble?
- The Unraveling of OpenAI and Microsoft's Bromance

Sign up for WSJ's free What's News newsletter. Learn more about your ad choices. Visit megaphone.fm/adchoices
Transcript
In the world of AI computing, one company stands at the top as king.
Nvidia.
Nvidia.
Nvidia.
Nvidia.
Can you characterize how big and how dominant Nvidia is?
Yeah, most people think that Nvidia controls 90% or more of the advanced AI chip market.
That's our colleague Robbie Whelan.
He covers tech.
And I can't think of another industry
where there's such a market concentration.
It's not that Nvidia has chased out the competition
or has some sort of nefarious strategy
to make it impossible for people to compete with them.
It's that they were very early first movers
in this idea that these chips
that used to be primarily used for video games
were also really, really good for doing
AI computing.
As the AI industry has boomed, more and more companies are developing their own AI chips
and coming for Nvidia's crown.
But the company that might have the best shot is a smaller one, one you've probably never
heard of: Advanced Micro Devices, or AMD.
So AMD is very much the David to Nvidia's Goliath, and the CEO of AMD, a woman named Lisa Su,
has a really good track record of slaying giants.
She's done it before, but she's never met an opponent like Nvidia.
Nvidia is truly a Goliath here, and the idea that she's going to take them on,
it's very audacious, and it's very exciting for a lot of people who watch this industry.
When Lisa Su took over AMD a little over a decade ago, its market cap was less than $3 billion.
Now, AMD is worth more than $350 billion.
And there's no sign of slowing down.
A few weeks ago, AMD scored a major coup
when it inked a massive chips deal with OpenAI.
Recently, Robbie sat down with Su to talk about the deal,
the company, and why she believes
we're only scratching the surface of the AI revolution.
Where are we in terms of focus right now
about going forward and how to get to where you want to be?
And where is that that you want to be?
Well, we're in a very special time.
So we are probably going faster
than we've ever gone before.
I mean, I certainly believe
that the technology is moving faster
than I've ever seen in my career.
And, you know, the role of AMD
is to enable all of that
with the foundational computing.
Welcome to The Journal,
our show about money, business, and power.
I'm Ryan Knutson.
It's Tuesday, December 9th.
Coming up on the show,
the CEO taking on Nvidia,
and why she's not worried about an AI bubble.
This episode is brought to you by Fidelity.
You check how well something performs before you buy it.
Why should investing be any different?
Fidelity gets that performance matters most.
With sound financial advice and quality investment products,
they're here to help accelerate your dreams.
Chat with your advisor or visit fidelity.ca/performance to learn more.
Commissions, fees, and expenses may apply.
Read the fund's or ETF's prospectus before investing.
Funds and ETFs are not guaranteed.
Their values change and past performance may not be repeated.
Lisa Su rose through the chip industry as an engineer.
Early in her career, she worked at IBM and Texas Instruments.
She's an engineer who has a PhD in electrical engineering from MIT.
She's deeply interested in what's called device physics,
which is sort of the marriage of gadgets and hard science.
So she understands how a piece of silicon inscribed with transistors
translates into the computing power that appears on your screen
when you're a software developer.
She understands it perfectly.
Su joined AMD in 2012 and took over as CEO two years later.
And soon after, she earned a reputation for making bold bets.
AMD is an American company that's been a long-time player in the chip space.
It was founded in the late 1960s, and it's based in Santa Clara.
When Su joined the company in 2012, it was mainly focused on chips for individual computers and electronics, known as CPUs.
But when she took over as CEO two years later, in 2014, she saw some holes in AMD's strategy.
She looked at the product line of what AMD was making, and she said, we're not focusing on the right things.
We need to be focused on accelerated advanced computing
because that's what our customers want.
She would go to these meetings and they would say,
it's great that you guys make computer chips
that power PCs or mobile devices,
but what we really want is a data center server
that can help us take our products to the next level.
Okay?
So she totally revamped AMD's product line
and AMD started making these chips
that are essential to the data center.
Su started focusing on the chips
that were critical for cloud computing,
an area that exploded over the last 15 years.
And the strategy paid off,
and it helped AMD become a major player
in a market that had long been dominated by Intel.
She positioned AMD in the data center,
ate up Intel's market share in data center computing,
and really leapfrogged Intel.
But the pivot proved consequential for another reason.
It gave AMD a foothold in data centers,
which would become critical in the rise of AI.
After the launch of ChatGPT in 2022,
Su saw an opportunity to pivot the company again.
They were having a board meeting in late 2022.
And so she comes to this board meeting,
and she says, look, I've had this epiphany.
I'm going to pivot the entire company.
She says, you know, artificial intelligence is rising.
It's a once-in-a-lifetime opportunity,
and we are positioned in a very special way
for us to take advantage of that.
She said, we're going to revamp our entire product line
so that it's now all oriented around artificial intelligence.
And this was a major turning point.
This time, Su wanted to take aim at Nvidia.
Back then, Nvidia wasn't a household name,
but it was clear to Su that it was on the rise.
ChatGPT was trained on Nvidia's chips.
It took seeing a product like ChatGPT
for most people in the world
to realize how big Nvidia was going to get,
how big AI was going to get,
and I think that was the real talent
Lisa Su has that shined through in this moment.
She said, this is going to change everything,
and we have to be in front of it.
We're going to bet the whole company on it, basically.
Exactly. We're betting the whole company on it.
By the way, Nvidia's CEO, Jensen Huang,
and Su are actually distant cousins.
They only met as adults.
Fast forward, you know, two to three years,
how does that pivot go?
Like, what does she actually do?
She goes top to bottom and revamps AMD's entire product line.
She rolls out three or four generations
of what's called the Instinct Series GPU.
So this is the first time that AMD, in the modern era,
in the AI era, is directly competing with Nvidia.
And to be honest, they have a hard time competing with Nvidia.
The first few generations of the Instinct were generally regarded in the market
as not as powerful and not as easy to use as Nvidia's equivalents.
But that's okay for a company like AMD,
because what they're doing there is they're trying to get a foothold,
they're testing out these products,
and they're trying to find customers to sign up and commit to using them.
To take on Nvidia,
Su decided to focus on chips that were one step ahead.
Rather than making chips that are good at training AI models,
which has dominated the AI market so far,
Su wanted to make chips that were good at inferencing.
What does that mean?
What is inference computing?
So in AI computing, there are two main functions
that people need to utilize.
When you're developing an AI model or an AI tool,
whatever it is, be it a chatbot or a video generation app,
you have to first train it,
and then you have to make it capable of responding
to queries, which is to say you have to run it.
And the running it is usually referred to as inference, inference computing.
Inference is when you or I sit down and say, hey, ChatGPT, what should I eat for lunch
today, and it spits out a few recipes for us.
That is inference.
It's querying the models, and it's running them.
That's it at its most basic.
The training is the learning part, when an AI model is devouring everything it can,
and the inferencing is the doing, the actual responding to prompts.
This decision is starting to pay off.
In October, AMD announced that massive deal with OpenAI
to help it run inference functions for ChatGPT.
Under the deal, OpenAI agreed to buy a ton of AMD chips starting next year.
And in return, OpenAI could get as much as a 10% ownership stake in AMD.
And when this announcement was made, AMD's stock shot through the roof.
They saw huge gains in their share price, and everyone was very excited about it.
Shares in the chipmaker AMD have surged over 30% today on news of a huge new tie-up with OpenAI.
And basically, the reason why everyone was so excited was because here you had what most people regard as the best company in AI development, which is OpenAI, not jumping ship entirely from
Nvidia, but saying, look, AMD is just as good for us
to operate our models, so we can use their chips,
we can pay them a tremendous amount of money to buy them,
and it's not just Nvidia anymore.
They're not the only game in town.
Yet amid all the excitement and money,
there's also concern about an AI bubble,
and the AMD deal with OpenAI raised questions
among some investors.
The AMD-OpenAI deal wasn't all sunshine and roses.
There were some concerns as well that came up
and they surrounded the financing of the deal.
And they sort of pointed to this idea of,
is this really a wonderful deal for everyone involved
or does it maybe indicate that we might be
in a huge, dangerous financial bubble surrounding AI?
After the break, Robbie sits down with Lisa Su
to talk about the concerns about an AI bubble.
Get you and your crew to the big shows with Go Transit.
Go connects to all the main concert venues like TD Coliseum in Hamilton and Scotia Bank Arena in Toronto,
and Go makes it affordable with special e-ticket fares.
A one-day weekend pass offers unlimited travel across the network
on any weekend day or holiday for just $10.
And a weekday group pass offers the same weekday travel flexibility
from $30 for two people and up to $60 for five.
Buy yours at go-transit.com/tickets.
Okay, well, without further ado, just jump right in.
Yes, please.
Recently, Robbie flew to a big AMD office
in Austin to meet with Lisa Su.
So can you just start by telling me, say your name and title,
and where you grew up and how long you've been in your current role?
Lisa Su, CEO of AMD.
I grew up in New York.
Born in Taiwan, grew up in New York,
and I've been in this role.
One of the main things Robbie wanted to talk with Su about
is the AI bubble,
and specifically whether AMD's deal with OpenAI might contribute to that bubble.
Under the terms of the deal, AMD gave OpenAI a big financial incentive to use its chips.
If OpenAI hit certain milestones for deploying AMD's chips,
it has the option to buy AMD shares at a steep discount at just one cent per share.
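Back-of-the-envelope, the economics of a penny-priced warrant like that are easy to sketch. Only the one-cent exercise price comes from the episode; the warrant size and market price below are illustrative assumptions, not reported figures:

```python
# Hypothetical sketch of the warrant economics described above.
# Only the one-cent strike is from the episode; the warrant size
# (~10% of AMD's shares) and market price are assumed for illustration.
warrant_shares = 160e6   # assumed number of shares covered by the warrant
strike = 0.01            # exercise price: one cent per share
market_price = 250.0     # assumed AMD share price at exercise
exercise_cost = warrant_shares * strike
position_value = warrant_shares * market_price
print(f"cost to exercise: ${exercise_cost / 1e6:.1f} million")
print(f"value of shares:  ${position_value / 1e9:.1f} billion")
```

Under those assumptions, OpenAI would pay about $1.6 million for a stake worth roughly $40 billion, which is why the incentive is described as so steep.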
If you're wondering why it sounds like money's going out the door from AMD to a customer
and then coming right back in the door to buy their chips,
it's because this is an example of what people have recently started describing as circular funding,
or circular financing.
Circular financing.
Circular financing.
Circular financing.
The circular financing, right,
where these companies are lending
to other companies
who are going to be buying their products.
And so with the amounts of money
that are being tossed around,
the numbers that are being tossed around,
when people see circular financing,
you know, they start to get worried.
They start to think, is this a bubble?
Is this all going to come crashing down?
Are these companies ever going to be profitable
and are the people who funded them
going to totally lose their shirts?
Su defended the deal in her conversation with Robbie.
We structured this deal so that there was complete alignment of incentives,
complete alignment that we wanted to go fast, that we wanted to go big,
and that if OpenAI is successful, AMD is successful,
because they will need lots of GPUs.
And the opposite is also true, which is if AMD is successful,
OpenAI gets to share in some of that upside.
Given the current phase that we're in of the AI boom
and the market conditions, do some of these big deals have to be done in this way that feel
sort of circular to investors? Look, I don't think you have to do anything. I think people make
choices. There are plenty of deals where, you know, we are basically selling GPUs. I think the
opportunity is to partner deeply in the ecosystem. There are, you know, different opportunities to do
that. And we've chosen a mechanism, which I think really ties the companies together in a way,
that, you know, they care about my success. I care about their success. And everyone wins in this,
you know, fashion. If, for whatever reason, the GPUs are not deployed, then there's also no
equity that goes along with it. And so from that standpoint, I just view it as a way to tie
ourselves closer together. And I think that's actually a good thing. But if I were an investor,
I could see myself thinking, this is great. The stock price is going up. It's accretive. But I
kind of wish we didn't have to give away 10% of the company to get it done.
What would you say to an investor who said that?
I would say you're probably undervaluing the importance of the partnership.
That it is not a pure transactional relationship.
It's one where we are, you know, deeply developing the future of AI,
and that's highly valuable.
Is she at all concerned about an AI bubble?
Yeah, so I said to her, a lot of people are saying that your deal with OpenAI
is going to contribute to the sense that there's a bubble.
And her response was essentially, I'm not worried about an AI bubble.
And she repeated this mantra she has,
which is basically there's insatiable demand for computing.
And that's why I'm so confident
that even if it takes unusual, unorthodox financing at the outset,
that things are going to turn out okay
and that we're going to benefit in a big way from all this demand.
We should see this kind of investment
because this technology does have so much potential,
and we are barely scratching the surface.
And frankly, what I'm seeing is the people who are being rewarded
are those who are willing to take big, bold bets.
And this is not the time to stay on the sidelines
and worry, hey, am I over-investing?
It's much more dangerous if you under-invest than if you over-invest,
in my opinion.
Su says that once AI models are up and running,
and people start relying on them,
there'll be almost no limit to the future demand
for AI computing.
She estimates that the overall market for AI
could be worth $1 trillion a year.
But isn't Su, the head of an AI chips company,
going to say that there's never-ending demand for chips,
the thing I happen to be selling?
That's exactly right.
So the bear case here is that OpenAI
and other similar customers
are going to spend way too much money on silicon,
and then they're going to never produce the kinds of profits and revenues that people expect them to,
and everyone kind of goes down in flames.
Everyone up and down the supply chain,
including the people who are supplying the silicon to these software developers
and the backbone of this computing power that everyone needs right now,
would see that demand evaporate when the gravy train eventually stops
and people realize that there's not enough demand in the market for the end product.
And so everyone up and down the supply chain
suffers. And you can easily see a world where if the bubble did pop, if there is a bubble and
if it does pop, then companies like AMD and Nvidia will really be directly in the firing
line. For AMD, the big money will be coming from inferencing, the space she's pivoted
the company toward. You know, inference is where the payoff is. It's actually where the money is
because inference says that you have a model that's good enough and now you're just deploying,
deploying, you know, you have thousands of agents, tens of thousands of agents, they're all
asking questions and you need to be able to support all that. So I think when you see inference
grow, that actually means AI is having an impact, you know, in the world. And because we see
so many different applications, yes, I think this is a place where the battlefield will be
around, you know, how do you really serve inference in the most, you know, efficient, effective
way.
The pivot to inferencing is something other chip makers are also thinking about, including
Nvidia.
Its CEO, Jensen Huang, has estimated that as much as 90% of the market for computing power
will be for inference computing.
Nvidia has signaled that it welcomes competitors.
It also said on X that it was, quote, a generation ahead of the industry.
AMD isn't the only company trying to break into the AI chip business and cut into
Nvidia's dominance in the market.
Other chipmakers like Broadcom and Qualcomm have also emerged as competitors, designing their
own AI chips and data center tech.
Silicon Valley giants outside of the chip world are also entering the space.
Google is selling access to data center chips that it previously reserved for internal use,
and Amazon has started selling chips that it says are faster and more energy efficient than
Nvidia's.
There are a lot of entrants in this market.
And so, yeah, this has implications for AMD.
I mean, Lisa Su is probably sitting there and watching and saying,
hmm, that's interesting.
That's going to make our lives more complicated.
But maybe it does also validate her view that this market is really just so big
that it requires a lot more suppliers.
Do you think AMD will be able to catch up to or even pull ahead of NVIDIA?
I think that's very unlikely.
I think that Nvidia's lead is so significant.
But there was a clue in the fine print
of the deal she made with OpenAI about one of her goals.
And that was that under this deal, OpenAI only gets the full 10% of AMD's stock
if AMD's stock reaches $600 a share,
which would imply that the full market value of AMD was $1 trillion.
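As a rough sanity check on that implied valuation, assuming a fully diluted share count of around 1.66 billion (my assumption, not a figure from the episode), the arithmetic works out:

```python
# Rough check: does $600 a share imply a ~$1 trillion market cap?
# The share count below is an assumption (roughly AMD's existing
# shares plus the warrant shares), not a figure from the episode.
shares_outstanding = 1.66e9   # assumed fully diluted share count
milestone_price = 600.0       # per-share target cited in the deal
implied_market_cap = shares_outstanding * milestone_price
print(f"implied market cap: ${implied_market_cap / 1e12:.2f} trillion")
```

With those inputs the implied market cap comes out to roughly $1 trillion, matching the figure in the reporting.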
So in other words, she wrote this deal with OpenAI
in a way that incentivizes the company
to become a trillion-dollar company.
So it sounds like she's sort of saying
that a rising tide lifts all boats,
that there's space for AMD to be worth a trillion dollars,
and Nvidia to be worth $4.5 trillion
because we're just going to need
all this computing power for the AI revolution.
That's exactly what she's saying, yeah.
That's all for today.
Tuesday, December 9th.
The journal is a co-production of Spotify
and the Wall Street Journal.
Additional reporting in this episode
by Amrith Ram Kumar.
Thanks for listening.
See you tomorrow.
