Motley Fool Money - The Silicon, Software, and Systems
Episode Date: August 19, 2024
AMD's latest acquisition is about building out an ecosystem and doing what it can to offer customers more in the AI race. (00:21) Asit Sharma and Dylan Lewis discuss:
- Why AMD is spending $4.9B... on ZT Systems, and what the company's rack-scale ambitions look like.
- General Motors' plans to lay off over 1,000 employees, and why it might be AI-driven.
- The questions that company leadership and boards should be asking as they think about AI, and two companies that have established good AI practices so far.
Articles mentioned on the show:
- WSJ piece: Why AI Risks Are Keeping Board Members Up at Night
- Salesforce's Generative AI Guidelines
Companies discussed: AMD, NVDA, GM, CRM, NOW
Host: Dylan Lewis
Guests: Asit Sharma
Engineers: Dan Boyd
Learn more about your ad choices. Visit megaphone.fm/adchoices
Transcript
This episode is brought to you by Indeed.
Stop waiting around for the perfect candidate.
Instead, use Indeed sponsored jobs to find the right people with the right skills fast.
It's a simple way to make sure your listing is the first thing candidates see.
According to Indeed data, sponsored jobs have four times more applicants than non-sponsored jobs.
So go build your dream team today with Indeed.
Get a $75 sponsored job credit at Indeed.com slash podcast.
Terms and conditions apply.
Upping the ante in the AI race. Motley Fool Money starts now.
I'm Dylan Lewis, and I'm joined over the airwaves by Motley Fool analyst Asit Sharma.
Asit, thanks for joining me.
Dylan, thanks for having me back.
This marks three Mondays in a row.
This has been my shot in the arm every Monday to make it through the rest of the day.
I appreciate you coming with me on the Monday show.
You know, we're catching up on everything that happens over the weekend.
Sometimes, you know, we wind up with the news fairy delivering something interesting for us to talk about.
Definitely the case today.
News out that AMD is acquiring ZT Systems.
And this is just the continued evolution of chipmakers upping the ante in the AI race.
Reportedly, the deal was for about $4.9 billion.
I'm going to read you, Asit, the pull quote from AMD's press release.
This is a strategic acquisition to provide AMD with industry-leading systems expertise
to accelerate the deployment of optimized rack-scale solutions addressing a $400 billion data center AI accelerator opportunity in 2027.
Help me unpack that. What are we talking about here with this deal?
Okay. So let's rewind the clock to several quarters ago when Jensen Huang, CEO of
NVIDIA, said that his company foresaw a need to replace these modern data centers,
AI data centers, which was a very new term at the time, every five years. What he was really
saying is that the technology is going to change very fast. And Nvidia didn't want to just provide
chips anymore. They wanted to really show companies how to build this modern data center. They
wanted to sell some of the networking equipment. They wanted to sell the solutions. And they've largely
done that. They're looked on as a very consultative partner with the big cloud hyperscalers,
with enterprise businesses, and they bring a lot of tech to the table. As investors, we tend to get
focused on the GPU fight between Nvidia and everyone else. But in terms of being able to help a
company decide what to put in its data center and how to do it, Nvidia is still top notch.
In fact, they have their own R&D data center.
They can configure anything on the fly.
It doesn't matter what kind of existing systems you have.
So what you just read is the answer from AMD to all that.
AMD is saying, look, we want to play at every level.
We just saw them spend, what, $600-odd million a few weeks ago to acquire a company called
Silo AI, which is the largest AI lab in Europe.
And here's where they're saying, we're going to bring a ton of expertise to the data center
as well.
It's not just going to be about our accelerators versus yours.
What ZT Systems does is give, and you named it in this quote, rack-level solutions.
That means they have a rack that has servers on it.
It's got a liquid cooling system.
It's got networking equipment.
It has power distribution.
It's got software.
And it's customized.
So you roll this rack in and you've got sort of a plug-and-play solution for a data center customer.
And guess who really loves ZT Systems?
It's Nvidia.
They've been great at helping Nvidia customize products and sell them faster.
So this is a way to level the playing field between these two giants.
We're now in round three of this heavyweight fight, early days still, but the blows are starting to land.
I feel like this is probably a 15-, maybe 25-round fight.
I think they're going to be duking this one out for quite some time.
And it's interesting that Nvidia is reportedly a customer of ZT Systems, because I wonder how
that dynamic will play out with it now being a part of AMD's portfolio. The deal is not supposed
to close until some point in 2025.
And so there's going to be some lead time for figuring this out.
The other thing that I see emphasized quite a bit in the coverage of this so far
is this notion of an ecosystem with AMD and what they're able to offer.
Is that getting at that exact thing you were talking about,
avoiding the reliance on just the GPUs
and starting to build out these more holistic solutions
to maybe make it a little bit harder for some of their customers to leave?
I think so.
And from the customer's perspective,
they always want an integrated solution.
That solution takes place over three big areas.
So you think about the terminology that Nvidia and AMD love to sling around.
Here's the way that AMD looks at it.
It's the silicon, the software, and the systems.
That's what they put in the same press release that you're reading from.
So silicon is like, okay, I know I've got to run my large language models on some chips,
and I want them to be fast.
And then the software: you also want software that helps you achieve your business objectives
faster, with maybe an edge that you can have over competitors. You want very flexible software
and fast software. And systems: you don't want to have to reinvest after three years. This goes
back less to what Jensen Huang is talking about today, where they want a new generation of
systems and hardware every year, and more to that original vision that a data center ought to be
able to keep its equipment for its customers for five years and replace it that way. I think AMD is speaking more to
this whole integration where if you are this big, sophisticated business, you don't have
to have various parts of your IT department coming back and saying, oh yeah, the silicon's great,
but the software, AMD, come on, the software. They've been working on their open-source
software, and on the AI part of that ecosystem as well with the acquisition I mentioned
from a few weeks ago. This is sort of the systems piece.
And I do think that it makes sense, both to compete with Nvidia, but also just to be able to
come to a hyperscaler, like an Amazon or Microsoft, or a Fortune 100 business, and say,
this is going to cost you a ton of money, but it's going to save you money over the long term.
We've heard Jensen Huang making this very argument.
Before this year, AMD wasn't able to make it.
With these last couple of acquisitions, it's starting to speak that same language about cost,
opportunity, and return on investment.
So I mentioned that sticker price of $4.9 billion.
That is going to be a combination of cash, but also, I think, some stock in the deal.
And if you're simply going as the crow flies on market cap, it does not look like a
particularly large acquisition for AMD.
It's currently roughly a $250 billion company.
But if you take a look at the balance sheet and you're kind of looking at it from that perspective,
AMD's sitting on about $5 billion in cash and equivalents.
As you mentioned before, Asit, they had made another acquisition fairly recently, another
much smaller one.
But I look and I say, you know, this is maybe a little bit of a bigger bet than the market
cap would imply?
I think so, Dylan.
And the point for AMD here is to allocate their funds very wisely.
You don't want to make an acquisition decision that becomes a financial decision.
In other words, you get some return on it.
That's a financial return, but it doesn't add value.
You want it to be, as they said, strategic. If you're going to use up your balance sheet, you want to have the ability,
with whatever asset you acquire. It could be a hard asset. It could be a series of contracts
that you acquire. It could be a company. You want to be able to plug that into your system and get
a lot more out of that. And at this sort of critical time for AMD, where it already has a lead
over other multifaceted chip-making companies but is playing second fiddle to Nvidia, if you're going to
use up that balance sheet and start to get to a position where future investment might be levered,
this is probably one that makes sense for you, because the systems work very well together:
AMD's chips, what they're acquiring with ZT, plus the AI piece that they've acquired.
All this flows through from one end to the other.
And I don't think it's wrong to call it like an ecosystem play.
Whenever I hear words like that, like platform, ecosystem, "We are the ecosystem player in
our industry with this acquisition," I always get skeptical, but here it's logical. I love that
you're pointing out that they're starting to use up that balance sheet, and they may have to
re-up in the future with some debt or more capital from the equity markets
or use some more of their free cash flow as they go along.
And to be clear, the market is rewarding them for that investment today. Shares up about 2% on the news.
They weren't too disappointed.
I think the market in general is rewarding that AI investment right now, which, if I'm
being honest, made it a little surprising when we look over at some other news from
today: General Motors laying off more than 1,000 employees in its software and services
division.
And this seems like a move driven by management's desire to slim down operations a little bit.
I'm not surprised by that.
I am surprised that they are targeting the software side of this business with slimming things down.
True. And you and I were chatting, Dylan.
Neither one of us is reading too much into this.
But it is interesting.
We had a wave of cost optimization last year where we saw lots of tech companies
laying off a ton of employees.
And we saw in the business world as well, there was so much staff reduction as interest
rates stayed high, inflation was high, and companies were trying to make sure they could
still improve profits.
That's tapered off a little bit.
And so I just sort of wonder here, we know GM has been up against a lot of flux in the industry.
EVs were hot. They're cooling off as a business prospect for these companies. They're still investing.
Is it really related to that, just sort of the shifting winds of EVs versus their traditional engines?
Or is it something that I think we might see from other companies in the future, which is to say,
these LLMs have become so good at coding and so good at giving architectural advice on software.
If you've got an objective and you describe it to a good large language model, do we really need
hundreds and hundreds of people coding and writing software and trying to architect this stuff?
Or could we take the best of the bunch, some software experts and some lower-level employees,
to sort of check behind this interplay between humans and the AI models? I wonder if that's not
what's going on here. But it's a data point.
You and I will follow this, I'm sure, as the months wear on.
Absolutely, yeah.
And I mean, we've been waiting for a while to see what the shakeout would be as artificial intelligence winds up working its way in more.
And I think, honestly, a lot of my thinking was that these highly technical tech jobs would probably be something that would be assisted.
And maybe we would see a little bit less new hiring going in, but maybe maintaining certain employment levels.
So I was a little surprised to see them ratcheting this down.
But it does remind me a little bit of a story I saw last week that I've been kind of sitting on, and I just wanted to get someone's take on this.
And this conversation kind of gives us a nice opportunity to do that.
The Wall Street Journal had an opinion piece out last week, Why AI Risks Are Keeping Board Members
Up at Night, kind of detailing the cost-reduction elements that are popping up for companies as
we're starting to get more use of LLMs, but also things like data privacy, things like the
information employees are sharing with the LLMs.
And what I liked about this piece is we've seen various forms of AI mongering,
you know, some of it being very good, some of it being very bad.
But this was one of the first times I've seen a board-level perspective on this,
where we're starting to think a little bit more about corporate liability.
We're starting to think a little bit more about the way that we're structuring policies for workers.
That's what a lot of the piece got into.
On that note, I mean, what do you want to be seeing from boards or from management teams when it comes to this stuff?
Dylan, I want to see boards really hone in on a few things.
One is to understand what the ethical implications of AI are.
Number two is to understand where things are proprietary and could be exposed.
As you mentioned, that's a growing concern.
And three, I want them to get more involved on a very minute level.
What I mean by that is it's been easy in the past to have board members who were experts in their field.
So if you had to do your audit compliance,
your audit committee was filled with people
who had this kind of experience.
Same goes for companies who want to maybe expand into markets.
If you're a tech company, you bring in a board member
who has great experience in the go-to-market function
in the industry you want to tap into.
That all makes sense.
But we need people who are very hands-on with AI.
And I think the Wall Street Journal article
that you referenced speaks to this a little bit. You can't really understand the effects of
this technology until you've played around with it. So if I were a CEO, I would be reluctant
to take advice from any board member who can't really show me that they are playing around
with AI and have more than a beginning level knowledge. That is, in this day and age, anyone can
pop a question to ChatGPT or Anthropic's Claude, right? So show me that you understand the systems.
And the reason this is important is because then you can get some guidance on how so many
other decisions can be made. Do we spend the money to just keep an LLM in-house and draw a circle
around it so stuff doesn't get out? Are we comfortable with a third-party provider who's saying
they've got a secure port and no one can ever get to our data? Do we understand if employees
who are playing with this stuff themselves and developing great software for our company are
following our rules? Maybe you have non-compete agreements in place. Maybe you need them if you
don't have those. And this goes for some smaller companies that have boards, too. The company doesn't have to be public to have
a board of directors to advise it. So there's so many issues related to both how this can hurt
companies in the long run, but also just the potential that needs some expertise. So this is a
challenge and it could be that we'll see boards creating positions that bring in just an AI expert.
to advise on these matters.
One of the things I thought was kind of interesting that came up in the piece, and we'll
link to it in the show notes because I think it's a good read, was that you have
some companies, they name-check Salesforce, publicly posting their guidelines for using and
developing things like generative AI, I think in part to create rules of the road for their
employees.
But I think also there are a lot of customers that use Salesforce CRM and some of their software suites.
And there are going to be questions about what the process is for the software that you
are consuming and then feeding your own data into on your side, because any privacy
or data elements kind of transfer.
There's a little bit of a transitive property issue here.
I thought it was a unique approach and one that I'm kind of looking for more companies
to be doing because I think we need that transparency.
I think that's such a nice point.
I think about companies like Salesforce and ServiceNow, which have been forthcoming in how they use data and how companies maybe can perceive them as a partner.
There is a lot of publication of guidelines and thought and how they're trying to protect assets.
But it's still like early Wild West here.
If you are a publicly traded company, the government's going to require you to have audited financial statements.
So investors can understand that you're not fudging the numbers.
We just refer to AMD's balance sheet, Dylan.
That's because there are rules and regulations that force them to put that stuff on paper
so you and I can look at it.
But the government really hasn't stepped in much to do anything about regulating the way
companies interact with generative AI.
Now, that's for the good and the bad.
Here in the US, we traditionally are an entrepreneurial society where we have good regulation.
It comes in stages as stuff evolves.
In Europe, it's a little bit of the opposite situation.
They're very quick to put in guardrails.
Sometimes that kills the investment and the creativity.
Not to say that Europe is any less creative than the U.S.
It just can be harder sometimes to bring new technologies to market.
And I'm somewhere in the middle.
I sort of want to make sure that we as a society keep our entrepreneurial edge and move with this stuff, go forward.
But I do want to see that along the way we're thoughtfully approaching the technology.
And sometimes, as you point out, companies take the lead in that.
then the government follows. It looks at what those thought leaders have put in place. And I will
guarantee you that the people who are employed to work on regulation, study those as best practices
and interact with them. So there is something circular here, and a few companies are taking the lead.
Asit Sharma, thanks for joining me today. Maybe, just maybe, we'll do it again and make it four Mondays in a row
next week. I'm up for it. Take it easy, and great to chat with you.
Okay, just a quick programming note, no second segment today or for the next few days.
The Motley Fool Money team is on site at Podcast Movement here in Washington, D.C.
And we'll be bringing you some special conversations on podcasting and the ad industry from that.
And hey, if you're at Podcast Movement, give us a shout, let us know.
You can reach us at podcasts@fool.com.
As always, people on the program may own stocks mentioned,
and the Motley Fool may have formal recommendations for or against.
So don't buy or sell anything based solely on what you hear.
I'm Dylan Lewis. Thanks for listening. We'll be back tomorrow.
