In The Arena by TechArena - Phoenix Rising: The Future of CPU Innovation with AMD
Episode Date: June 13, 2023
TechArena host Allyson Klein chats with AMD's Lynn Comp about the changing landscape of CPU design and how AMD is poised to lead future innovation....
Transcript
Welcome to the Tech Arena, featuring authentic discussions between tech's leading innovators
and our host, Allyson Klein.
Now, let's step into the arena.
Welcome to the Tech Arena. My name is Allyson Klein, and today I'm delighted to be joined by Lynn Comp of AMD.
Lynn, welcome to the program. It's great to have you here.
Thank you. Thank you, Allyson. I'm so excited to be here.
Lynn, why don't you go ahead and introduce yourself and talk about your purview at AMD
and how it relates to the topic today?
Absolutely.
So I am a corporate vice president running server technology marketing, and that includes
product as well as the technologies that are next to the CPU and enable some really important workloads and use cases.
And that is inclusive of the EPYC CPU business across all of the various markets that would be using it, whether it's HPC, enterprise, or cloud.
You spent a lot of time in the silicon arena, and I think folks could argue that we're in a golden age of silicon. What is different today about the semiconductor industry and what is being delivered to market compared with what we were delivering 10 years ago?
That's such a great question, Allyson. Ten years ago, we were really starting to believe that a lot of the important innovation happening for data centers in particular was going to happen at other parts of the solution stack. And there was a huge consolidation of different architecture types and instruction sets that was, in a way, signaling that the innovation at that level was pretty mature.
What's really interesting now is that there are significantly more instruction sets and
architectures than there had been.
So we have ARM starting to come into the data center. We've got RISC-V conversations.
And at the same time, a lot of the logic that the CPUs depend on is creating its own level of bottlenecks: power controllers, or chipsets as they're called in the industry. So silicon has become critical. One of the other dynamics as well is that silicon, and CPUs in particular, used to show up largely in technology solutions. So you've got silicon in laptops or infrastructure or servers. But today,
when you look at something like a car, it has so much software code and a significant number of silicon devices that are programmable, from the brake control systems through the onboard environmental controls and entertainment systems. Basically, this industry
is getting embedded in everything at this point in time. So that's pretty exciting. It's also a huge responsibility.
Now, all of that relates to data center compute. And I think that there's
never been a greater demand for data center compute. We saw that in the pandemic. We see that
in the delivery of digital services for everything.
And all of those devices that you're talking about relying somehow on the data center for
part of their function.
What do you see as the biggest challenge to keeping up with this demand at this point?
Well, what I've seen for every technology adoption wave has been a massive build-out.
Once the key growth area has been uncovered and the commercialization of that domain is clear, it's a race to build out.
And we've seen that over the last 10 to 15 years with the cloud service providers. It's bringing up instances; it's 24/7, follow-the-sun models.
And the part of the cycle that's right after the build out tends to be the tune and optimize
and make sure that you're getting everything out of those
resources because you can't continue to grow just by building out.
And so you see a lot of conversation around data centers in specific parts of the world that are constrained for space and/or power delivery.
There's a lot of questions around how do you have cooling for huge data centers?
Beyond that, there's also where you have the data.
So composable infrastructure, edge, and CXL, to me, all reflect attempts to solve the same fundamental problem: how do I have the data where I need it? How do I have enough space for the data that I need? How do I keep the processing that the data is being used for from stalling? And so I think that there are some real challenges about
where do you need to have your computing, where do you need to have the data that feeds that compute,
as well as optimizing the build out that we've done and getting more out of the assets and
resources that have already been deployed.
Now, you just talked a lot about requirements for performance and capability. I think that one of the things that is on the top of my mind,
and it's something that I saw you talk about at the OCP Prague event last month, was that
there is a greater focus on sustainability. I read somewhere that computing
infrastructure consumes up to 14% of the world's energy supply. Do you think we're at the point
where sustainable compute is as critical as compute performance for customers?
For the end customers that are leveraging the services, not quite yet.
I was at an earlier event before the OCP event where I had a company pull me aside and say, we think energy efficiency is a great message and it's really important, but security
is really our concern. And that was a data point of one or a sample size of one. But I do think
that it's emerging. It's not necessarily a first-order problem. Years ago, when you and I worked together in a different life, we were both pushing the thinking around a technology that benefited OPEX and benefited power bills. And if the people who have to buy and deploy the technology aren't paying the power bill, it can be very difficult to sell. However, when I look at Microsoft, Google, and Amazon, a lot of their documentation now talks about sustainable software practices. I think we're just at the beginning of sustainable
computing, and it will get more so, but it's not quite at that inflection point yet for it to be the mainstream thing equal to compute performance.
We were recently on a panel together discussing multi-cloud
architectures. There's been a lot of talk about multi-cloud and I think that many customers have
chosen multi-cloud solutions for a number of reasons.
Do you think that customers are getting their full value out of cloud?
And if not, what does the industry need to do to deliver to get the full value prop to customers?
So that's such an intriguing question. I believe the customer has to make a very, very deliberate decision: what runs there? How do you run and secure the workload? And then how do you basically get access to the results? So there are things that are happening, such as on Microsoft Azure, the use of EDA tools for silicon design.
In fact, AMD is using that for some of our silicon design processes.
So that burst-out capability, the ability to have access to resources really quickly and not wait for the hardware schedule, that full value prop, I believe, is why most people go in.
Where the industry gets it wrong is, again, solving the right problem with the wrong tool, I guess would be the right way to put it.
That, I think, is where there could be disappointment with the cloud experience, not necessarily because it's a bad
experience, but because the expectations were just out of, you know, basically they weren't
aligned with what the cloud actually does solve well.
Now, all this talk about customer demands across sustainability, performance, and multi-cloud capability.
One of the things that is true is that AMD has emerged as the performance leader for
data center CPUs and is well-positioned to address customer demands across all of those
vectors.
Tell me, how are you going to use this position to drive continued industry-wide innovation?
Well, first of all, one of the things that we truly believe in is enabling easy innovation.
There are very often three dynamics that go into our customers' customers having that sense of, I have met or exceeded the expectations I have for the solution.
One of them is the elegance of the solution.
The next one is the value or the expense versus value of the solution.
And the third one is: is it easy to integrate? And one of the things that AMD has really been focusing on is making it as simple as possible to take advantage of the VM and have your keys completely obfuscated from your service provider, whether internal IT or a cloud. And yet you don't have to change the application to use it. And many security technologies do require you to change things higher in the software stack, which makes it challenging. It's not hitting that easy button. So a lot of what AMD is focusing on is making it as fast as possible and as easy as possible without recrafting code. Customers are not having to deal with going back and revalidating. They have the ability to just
move and operate at the speed of business.
I can't leave this interview without bringing up AI.
AI has been everywhere. Everybody's talking about generative AI and how it's a transformative technology. And obviously that is rooted in the data center. How should we view AMD's lineup when
it comes to AI? And should we expect more in terms of the portfolio in the future?
Love that question. AMD has such a great portfolio of options for different types of AI solutions.
And one of the things that I think is really interesting about the AI industry as a whole right now is that three years ago, there were a lot of people who were chasing, and understood well, the requirements of recommendation engines, because the recommendation engine was essentially how their advertising engine operated. It was basically the gas that kept that engine going.
And what's really fascinating is if you look at ChatGPT, which is a different model and
a different commercialization opportunity, you
have a different set of requirements.
And so I would say that what you should expect more from AMD in the future is, back to that theme, simplifying the ability to get the most out of an AI solution so that your business can run and you get the insights at the pace of business.
And given the diversity we're already seeing in AI commercialization models, I think that there is so much yet to happen in terms of the market opportunities for the technologies behind these solutions.
We'll be really surprised in 10 or 20 years, just like we were really surprised when we saw how far personal computing had extended from hobbyists through to commercialization opportunities.
Lynn, one final question for you. I think that we've covered a lot of ground today. I'm sure we've piqued the interest of our audience to talk to you and your team more about AMD solutions and what you're delivering to the industry.
Where can they find out more about the products that you're delivering in the data center and reach out to your team?
So, first of all, the best place to go for product data is obviously going to be amd.com. However, people can
connect with me and my team on LinkedIn. I have a LinkedIn site that's open, so you can connect
there or email me at lynn.comp@amd.com. And I will make sure that you get the right person
for whatever questions you have.
Thanks so much for being on today.
It was a real joy.
Thank you so much, Allyson.
Thanks for joining the Tech Arena.
Subscribe and engage at our website, thetecharena.net.
All content is copyright by The Tech Arena.