SemiWiki.com - Podcast EP287: Advancing Hardware Security Verification and Assurance with Andreas Kuehlmann
Episode Date: May 16, 2025
Dan is joined by Dr. Andreas Kuehlmann, Executive Chairman and CEO at Cycuity. He has spent his career across the fields of semiconductor design, software development, and cybersecurity. Prior to joining Cycuity, he helped build a market-leading software security business as head of engineering at Coverity, which was acquired…
Transcript
Hello, my name is Daniel Nenni, founder of SemiWiki, the open forum for semiconductor
professionals. Welcome to the Semiconductor Insiders podcast series.
My guest today is Dr. Andreas Kuehlmann, Executive Chairman and CEO at Cycuity. He has spent his career across the fields of
semiconductor design, software development, and cybersecurity. Prior to
joining Cycuity, he helped build a market-leading software security
business as head of engineering at Coverity, which of course was acquired by
Synopsys. He also worked at IBM Research and Cadence Design Systems where he made
influential contributions to hardware verification.
Andreas also served as an adjunct professor at UC Berkeley's Department of Electrical Engineering and Computer Sciences for 14 years.
Welcome to the podcast, Andreas.
Thank you, Daniel. Thank you for having me.
So the first question I'd like to ask is what brought you to semiconductors? Do you have an interesting story you can share?
I certainly have a story, Daniel.
If it's interesting, we'll see.
I actually grew up in the former East Germany.
So in the '80s, I was a student of microelectronics
and electrical engineering.
And in one of the companies where I was an intern,
they reverse engineered chips from the West,
copied them, and manufactured them.
So one of my first assignments was essentially to take some of these
circuits, use an equivalent of SPICE, and try to figure out what they did.
So that was my first job.
It really got me interested in chip design, digital design, and then generally semiconductors.
That's a great story.
And what brought you to Cycuity?
All right.
Yeah, as you mentioned before, I was at Synopsys first through the Coverity acquisition and
then running the software integrity business until 2020.
And then I left Synopsys, and there was this little
startup company called Tortuga Logic, now Cycuity,
and that was really doing security in semiconductors.
And that kind of interested me, because I have a strong
semiconductor background,
but by now also quite some experience in security.
So I thought that this seems to be really
the next frontier of security.
So what we do at Cycuity is we help organizations
in the pre-silicon design process
to ensure that the chips are actually secure,
that the elements implementing security functionality
are securely implemented,
that you don't have any leakage,
and that you don't have any confidentiality or integrity problems.
So that interested me. It was ahead of its time, and I've always felt I've seen that movie before, when I started in software security
10, 15 years ago, where in the beginning everybody just ignored it, and then over time it became
a very critical element of development.
For sure. Why is semiconductor security assurance rapidly
becoming a strategic imperative rather than, you know, a late-stage checklist
item?
Yeah, as I mentioned, I worked for a long time in software security, and
everybody just assumed the chips are secure, right? They're running it on
whatever microprocessor, on whatever platform.
And nobody really thought about, you know,
that chips could actually be directly attacked
on the hardware level.
That has dramatically changed in 2018
with the discovery of Spectre and Meltdown,
where it became clear that chips not only can be attacked,
but they can be attacked remotely,
just the same way software is being attacked.
So think about it, suddenly the attack surface
for any electronic system has vastly broadened.
And the problem with chips is also
you cannot just quickly patch it.
I mean, if you're lucky, you can patch the firmware,
but very often you have to either exchange the chips
or you have to disable some functionality.
So since 2018, there has been a dramatic increase in not only adding security functionality
to chips, but also ensuring that the functionality is indeed secure, meaning that nobody
can hack the chips themselves.
And you have also seen over the last years the same movie we have seen in the software world:
an increasing number of regulations and standards, as in the automotive industry, where the ISO 21434
standard requires that chips be certified for cybersecurity.
So this movie from 15 years ago in the software world is just repeating, only much
faster.
Yeah, let's talk about that a little bit more. So what are some of the systemic
barriers you've observed that prevent organizations from getting started with
security verification, and can you maybe share some best practices to overcome
these? Yeah, right. I think it's very similar to what we have seen in the software
world 15 years ago. The first barrier is organizational.
Often there's no clear ownership of security identified in the organization.
You have it distributed.
There may be a product security team.
There may be a CSO from the software world having some responsibility for the hardware,
some responsibility lies in the design team, some responsibility in the DV teams.
But it's really not clear who is driving it, who has the budget, who
has the responsibility.
That's number one.
I think number two, what we see over and over again is resources.
I mean, the people that are needed for ensuring
that chips are indeed securely implemented
both on the verification side and the specification side,
these people are in high demand.
And you know, I mean, chip designs right now,
people are cranking out tape-outs
at a frequency unseen before.
So everybody is oversubscribed.
So finding the resources is the second problem.
And the third one I would really see is setting the company on the roadmap for a security
journey, meaning get going, and get going small. Don't boil the ocean right away. Get going
small and move forward. So, coming to some of the best practices,
one of them is what I just said: start small,
but put some kind of virtual organization in place.
What I've seen being very successful is if you have security champions, you know,
in the different design groups,
and you know, having some virtual organization
where you meet regularly, you exchange ideas,
you exchange best practices,
and start driving really a broader effort in the company.
But the second one, what's really important,
is executive support.
I mean, the executive team has to be supportive,
it has to fund it, it has to make sure
that this becomes a priority.
Right, you know, there's been a lot of effort
to shift functional verification left,
getting it earlier into the chip design process
to be more proactive and more efficient.
When is the right time to start with security verification?
Yeah, very good question, Daniel, right?
We have seen the whole movie in functional verification,
right, we talked for a long time,
shift left, start verifying functionality as early as possible.
The story is exactly the same in security.
The earlier you start, the earlier you can find potential vulnerabilities, the lower
the cost, the higher the probability that you actually fix them.
I mean, it may sound very funny, but I have seen companies, when they discover
a vulnerability very late, let it slip:
don't talk about it, just get going,
we may fix it in the firmware later, and so on.
So "the earlier, the better" is just the same paradigm
as we have seen in functional verification.
Right. Can you comment on the role of Common Weakness Enumerations, or CWEs, to help drive systematic
hardware security assurance?
Yeah, the CWEs, that's a very interesting one, the Common Weakness Enumeration.
That's actually a community effort supported by MITRE and the U.S. government. It was founded in 2006, and
for a long time it really had a bit of a shadow existence, but
then in the software world, around 2010-2011, it really took off and became a main driver
to articulate categories of software weaknesses that are potential security issues. It
quickly became a standard around which customers, developers,
and the whole ecosystem could talk; you can report
formally which CWEs you have covered in your verification flow, and so on.
The CWEs were extended to hardware in 2020, and we now have more than 100 CWEs in the hardware
world, and I see exactly the same success here.
It's a nomenclature.
It's a way to talk about what matters on a particular chip design, and a way to talk about what
matters to the customers. And it's a checkbox.
It's very coarse, but it's a coverage metric: you can say,
hey, did you check this CWE, and so on.
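To make the checklist idea concrete, here is a minimal sketch of CWE coverage bookkeeping in Python. The hardware CWE IDs below are real entries from the CWE list, but the statuses and the helper function are hypothetical, not any vendor's actual tooling.

```python
# Minimal sketch of CWE-based coverage bookkeeping (illustrative only).
# CWE IDs are real hardware CWEs; the statuses are made up for the example.
hardware_cwe_status = {
    "CWE-1191": ("On-chip debug/test interface with improper access control", "verified"),
    "CWE-1231": ("Improper prevention of lock bit modification", "verified"),
    "CWE-1244": ("Internal asset exposed to unsafe debug access level", "open"),
    "CWE-1272": ("Sensitive info uncleared before debug/power state transition", "not_applicable"),
}

def coverage_report(status_map):
    """Print each CWE's status and the ratio of applicable CWEs verified."""
    applicable = [s for _, s in status_map.values() if s != "not_applicable"]
    verified = sum(1 for s in applicable if s == "verified")
    for cwe, (title, status) in status_map.items():
        print(f"{cwe}: {title} -> {status}")
    print(f"coverage: {verified}/{len(applicable)} applicable CWEs verified")

coverage_report(hardware_cwe_status)
```

Even a coarse tally like this gives customers and auditors a shared vocabulary: which weaknesses were considered, which were checked, and which remain open.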
So meaningful metrics are key to successfully manage semiconductor designs.
Are there metrics for security verification and how are they applied in practice?
Yeah.
I mean, you're right on, Daniel.
I mean, metrics are critical for any organization that wants to manage success, right?
Metrics are key.
And that's the same in security, right? So when you look at chip designs
and you want to measure how much you have done
in terms of security coverage,
it really starts with some methodology to look at,
okay, first of all, have I looked at all the secure assets?
A secure asset, think about it as an element on the chip:
a data item, a register, a memory range
that carries some secret information. It could be an encryption key,
it could be some personal data, and so on. Have you enumerated all of them, and
have you done a threat model for each of them? Meaning, what do you worry about? Is the worry that
a certain asset leaves the chip? Is the worry that someone from outside
can overwrite that asset and change the behavior of the chip? So first of all, really systematically
going through your threat model, going through your assets, going through what matters to you.
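To illustrate that enumeration step, here is a minimal sketch, with hypothetical asset names, of listing secure assets and checking that each one has a threat-model concern attached; it is a toy example, not a real methodology tool.

```python
# Toy secure-asset inventory with per-asset threat concerns (hypothetical).
from dataclasses import dataclass, field

@dataclass
class SecureAsset:
    name: str       # signal, register, or memory range on the chip
    kind: str       # e.g. "register" or "memory_range"
    concerns: list = field(default_factory=list)  # threat-model worries

assets = [
    SecureAsset("aes_key_reg", "register",
                ["confidentiality: key material must never leave the chip"]),
    SecureAsset("boot_rom", "memory_range",
                ["integrity: nothing outside may overwrite boot code"]),
    SecureAsset("device_uid", "register", []),  # enumerated, not yet modeled
]

# Completeness check: every enumerated asset needs a threat model.
for asset in assets:
    status = "ok" if asset.concerns else "MISSING threat model"
    print(f"{asset.name:12s} ({asset.kind}): {status}")
```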
And then apply security verification with the various technologies that are available:
formal verification technology, for example.
We at Cycuity have a technology called Radix that is based on information flow.
And, you know, there are alternatives to that.
Now use these technologies and have metrics within them.
In Radix, we actually have a coverage metric that tells you how hard
you have tried to break, essentially,
a secure element out of the chip.
How hard have you banged against the wall, in a way,
what we call the protection boundary,
to try to get a secure encryption key,
for example, out of the chip.
So we have a very crisp coverage
metric that's very much aligned with what we do in functional verification.
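To illustrate the information-flow idea behind such a coverage metric, here is a minimal, hypothetical Python sketch (not Radix itself) that propagates a taint label from a key register through a toy netlist and flags any flow that leaves the chip outside the declared protection boundary.

```python
# Toy information-flow (taint) propagation over a tiny netlist.
# Illustrative sketch only; signal names and structure are made up.
from collections import deque

# Each signal maps to the input signals that drive it.
netlist = {
    "key_reg":   [],                    # holds the secret encryption key
    "pt_in":     [],                    # plaintext input
    "aes_core":  ["key_reg", "pt_in"],  # consumes the key internally
    "ct_out":    ["aes_core"],          # ciphertext output (intended flow)
    "debug_bus": ["key_reg"],           # accidental tap on the key
    "jtag_out":  ["debug_bus"],         # externally visible debug pin
}

external_outputs = {"ct_out", "jtag_out"}  # observable outside the chip
protection_boundary = {"ct_out"}           # flows allowed to carry key data out

def tainted_signals(netlist, source):
    """Return every signal reachable from `source` via data flow."""
    driven_by = {sig: set() for sig in netlist}
    for out, ins in netlist.items():
        for sig in ins:
            driven_by[sig].add(out)
    seen, queue = {source}, deque([source])
    while queue:
        for nxt in driven_by[queue.popleft()]:
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return seen - {source}

for sink in tainted_signals(netlist, "key_reg"):
    if sink in external_outputs and sink not in protection_boundary:
        print(f"violation: key data can reach {sink}")  # flags jtag_out
```

A coverage metric in this spirit would count how many such source-to-boundary paths have been exercised, analogous to how functional coverage counts exercised behaviors.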
Great. Final question. Looking forward, what shifts do you foresee in the
structure of security programs or methodologies over the next, say, three to
five years? Well, I think, Daniel, you and I have been in the
industry for a long time, right? And we had the whole history of
sign-offs before manufacturing, right? Every time there was a new
problem in chip design, whether it's timing sign-off, power sign-off,
signal integrity sign-off, and so on,
that problem was identified, technology was developed and delivered to handle it. And then three to four years later,
everybody had adopted it and it got to a sign-off methodology. I expect exactly the same
here. I expect in four to five years, every chip will have to go through security sign-off,
if there are security concerns on the chip. That's certainly not
the case for very simple chips and so on. But for the majority of the chips right now, there's
some security functionality on the chip, and it will have to be shown that it's indeed
securely implemented.
Yeah, I agree completely. Security is critical. So great conversation, Andreas. Thank you
for your time and I look
forward to following the success of Cycuity. Thank you, Daniel. Thank you for
having me. That concludes our podcast. Thank you all for listening and have a
great day.