CyberWire Daily - Quantum-proof and ready: NIST unveils the future of encryption. [Special Edition]
Episode Date: August 25, 2024

In this Special Edition podcast, N2K's Executive Editor Brandon Karpf speaks with Dustin Moody, mathematician at NIST, about their first 3 recently finalized post-quantum encryption standards. NIST finalized a key set of encryption algorithms designed to protect against future cyberattacks from quantum computers, which operate in fundamentally different ways from traditional computers. Listen as Brandon and Dustin discuss these algorithms and how quantum computing will change the way we view encryption and cyber attacks in the future.

Resources:
NIST Releases First 3 Finalized Post-Quantum Encryption Standards (NIST)
FIPS 203
FIPS 204
FIPS 205
What is Post Quantum Cryptography? (NIST)
National Cybersecurity Center of Excellence (NCCoE)
Post-Quantum Cryptography Standardization Project (NIST)
Need to know: NIST finalizes post-quantum encryption standards essential for cybersecurity. (N2K CyberWire)

Learn more about your ad choices. Visit megaphone.fm/adchoices
Transcript
You're listening to the Cyber Wire Network, powered by N2K.
Air Transat presents two friends traveling in Europe for the first time and feeling some pretty big emotions.
This coffee is so good. How do they make it so rich and tasty?
Those paintings we saw today weren't prints. They were the actual paintings.
I have never seen tomatoes like this.
How are they so red?
With flight deals starting at just $589,
it's time for you to see what Europe has to offer.
Don't worry.
You can handle it.
Visit airtransat.com for details.
Conditions apply.
AirTransat.
Travel moves us.
Hey, everybody.
Dave here.
Have you ever wondered where your personal information is lurking online?
Like many of you, I was concerned about my data being sold by data brokers.
So I decided to try Delete.me.
I have to say, Delete.me is a game changer.
Within days of signing up, they started removing my personal information from hundreds of data brokers.
I finally have peace of mind knowing my data privacy is protected.
Delete.me's team does all the work for you with detailed reports so you know exactly what's been done.
Take control of your data and keep your private life private by signing up for Delete.me.
Now at a special discount for our listeners,
today get 20% off your Delete Me plan when you go to joindeleteme.com slash n2k and use promo code n2k at checkout. The only way to get 20% off is to go to joindeleteme.com slash n2k and enter code
n2k at checkout. That's joindeleteme.com slash n2k code n2k. And now, a message from our sponsor, Zscaler, the leader in cloud security.
Enterprises have spent billions of dollars on firewalls and VPNs that are exploited by bad actors more easily than ever with AI tools.
It's time to rethink your security.
Zscaler Zero Trust Plus AI stops attackers by hiding your attack surface,
making apps and IPs invisible, eliminating lateral movement,
connecting users only to specific apps, not the entire network,
continuously verifying every request based on identity and context. Protect your organization with Zscaler Zero Trust and AI. Learn more at zscaler.com slash security.
Thanks for joining us for this CyberWire special edition.
Dustin Moody is a mathematician at NIST,
and recently N2K's Brandon Karpf sat down with Dustin
to discuss the first three finalized post-quantum encryption standards.
Here's their conversation.
I'm joined today by Dustin Moody, Supervisory Mathematician at the National Institute of Standards and Technology. And Dustin's here today to fill us in on the recent standards
released by NIST around post-quantum cryptography. Dustin, great to have you on the show.
Really excited to have this conversation.
Great, happy to be here.
So could you fill us in just on the background of NIST's project for post-quantum cryptography
and where we've gotten to today and ultimately what the goal is for the program?
Yeah, certainly.
So since the 1990s, cryptographers and others have been aware that if a large scale quantum computer could be built, it would break some of the crypto systems that we rely on to protect our information.
At that time, quantum computers were only imagined. They weren't realities. But since then, different companies and organizations have been working on building them because they would bring a lot of positive benefits to society. They could
do a lot of things that our current computing technology cannot. At NIST, our particular group
deals with cryptography, and we approve the algorithms that the federal government uses to protect all of our information.
So we were aware of this. Probably around 10 years ago, we started scaling up our project a little
bit because we saw the progress in quantum computers was growing and that they were
becoming larger. They're not large enough to threaten current cryptographic levels,
but we need to get these standards in place well in advance of that.
So at NIST, we started building our team and our expertise,
and we eventually decided that the best way to create new standards
to get new crypto systems in place
would be to do a large international competition-like process
to select algorithms that we would evaluate internally and the cryptographic community could also evaluate.
And NIST has done this sort of thing in the past, and it has gained a lot of acceptance, a lot of credibility,
because people can trust the algorithms that come out of this because they've been so well studied.
So we announced that back in 2016 that
we would be doing this. In response, we received a large number of submissions. We had a total of 82
that were sent in to us from different teams around the world who had all designed the best
algorithms that they could come up with to provide protection. Over the past eight years or so, we've gone through a series of evaluation and analysis. Internally, we've looked at them, we've implemented them, checked
out their performance benchmarks. And similarly, people around the world have been doing the same
thing. Some of them were broken along the way. That's what happens. The strongest ones survive,
and we have more confidence because they've been studied so carefully.
So after a series of three rounds back in July of 2022, we announced the four algorithms that we would be standardizing as a result of this process.
Since that time, it took us a year or two to write up the standards for those algorithms. But that's where we are now.
Great. So could you walk us through those standards? There was, just in the last few weeks,
three that were officially released, and it sounds like a fourth might be on its way.
Could you walk us through these?
Yeah. So we were looking for two different cryptographic functionalities,
one of which is to do key establishment, or you can equivalently do encryption.
And another is to do what's called digital signatures, which are used to provide authentication online.
We selected a few algorithms for each category.
For digital signatures, we selected an algorithm called CRYSTALS-Dilithium. It's the main algorithm
that we expect people to use. It's based on something called lattices. We can get into all
the math if you really wanted to, but most people are just happy to know that it's based on something
called lattices. We also selected two other algorithms. Another algorithm based on lattices
that's called Falcon. It has smaller key sizes
than Dilithium, but its implementation is a lot more complex. You have to use floating point
arithmetic and many devices might struggle to securely implement it. So it's available for
certain applications that really need those shorter signatures, but most applications will
be able to use Dilithium just fine. The third signature that we selected is called SPHINCS+.
It's based on a different idea than lattices.
The idea there is to have a backup
in case there's some attack or some vulnerability discovered.
We have something not based on lattices.
It is around for that purpose.
However, it's a bit slower and bigger,
so it wouldn't work in many applications.
But if security were your number one concern,
its security analysis is a bit more conservative.
So for some users, it might be their choice.
So those were the three signatures.
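To make the signature workflow concrete, here is a minimal sketch of signing and verifying with a Dilithium-family algorithm, assuming the open-source liboqs-python wrapper (the `oqs` module) is installed; the algorithm identifier shown is a placeholder, since the exact name (for example "Dilithium3" versus "ML-DSA-65") depends on the library version.

```python
# Minimal sketch: post-quantum signing and verification with liboqs-python.
# Assumes `pip install liboqs-python`; the algorithm name string is a
# placeholder and varies by library version.
import oqs

ALG = "Dilithium3"  # newer releases may expose this as "ML-DSA-65"
message = b"NIST has finalized FIPS 204."

with oqs.Signature(ALG) as signer, oqs.Signature(ALG) as verifier:
    public_key = signer.generate_keypair()   # secret key stays inside `signer`
    signature = signer.sign(message)
    print("signature valid:", verifier.verify(message, signature, public_key))
```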
We selected an algorithm called CRYSTALS-Kyber for key establishment. It was also based on lattices. Over the course of the past eight years, lattices turned out to be the most promising area for post-quantum algorithms. It has great performance, has great security. So it was selected.
Now, three of those were standardized. That's Kyber, Dilithium, and SPHINCS+. They came out in documents that we call FIPS, Federal Information Processing Standards. And they first went out for public comment. We had a draft form. We got some feedback, made a few small changes. And then just a week ago, we published them in their final form so that people can begin to use them. The fourth algorithm, Falcon, that was selected, we're still writing
the standard. It's not yet done. We wanted to focus on Dilithium first because it's the primary
signature we want people to use. And because of the complex implementation, it's just taken us a
little bit longer to write the standard,
and we hope to have it out by the end of 2024. Great. Well, curious here, because you've mentioned a couple different techniques. And you mentioned lattices, and it seems like three of the four
are based on lattice techniques. What about that approach to cryptography makes it
inherently more secure against quantum-based
attacks? Yeah, so that was over the past several years. Well, people have been studying that exact
question you asked about, how can we protect against quantum computers? And the crypto systems
we use today, it turns out they're all based on hard mathematical problems that are difficult for a computer to solve. So an algorithm that we use today is known as RSA. Its security relies on the fact that if you have
a really, really large number, it's hard to break it down into its prime factors.
So for quantum computers, what we needed to find were algorithms that were based on hard problems
that quantum computers are not known to be able to solve any faster than classical computers.
And mathematicians and computer scientists have studied that for a few decades and found a few different promising areas.
Lattices were one.
There's a couple of hard problems associated with lattices, known as the shortest vector problem or learning with errors.
Another family of algorithms is based on what's called error-correcting codes.
A third is based on multivariate algebra.
And for each of these, it's hard to describe exactly why a quantum computer
doesn't seem to be able to break them,
other than very smart scientists who understand quantum algorithms
have tried their best. They've looked at all the known algorithms, and none of them seem to provide
any avenue of attack that would break these. We have no absolute guarantee, but that's the case
even in current cryptography. It could be the case, you know, some brilliant person comes up
with a new idea that breaks what we're using today. Sure, that's science and that's discovery. That's how that works.
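As a rough illustration of the learning-with-errors problem mentioned above, the toy sketch below builds an LWE-style instance with deliberately tiny parameters; real lattice schemes like Kyber and Dilithium use structured variants with far larger dimensions, so this only shows the shape of the problem, not a secure construction.

```python
# Toy learning-with-errors (LWE) instance -- illustrative only, NOT secure.
import random

q, n, m = 97, 4, 8                      # tiny modulus, secret dimension, samples
secret = [random.randrange(q) for _ in range(n)]

samples = []
for _ in range(m):
    a = [random.randrange(q) for _ in range(n)]
    e = random.choice([-1, 0, 1])       # small random "error"
    b = (sum(ai * si for ai, si in zip(a, secret)) + e) % q
    samples.append((a, b))

# The LWE problem: given only `samples`, recover `secret`.
# Without the small errors this is simple linear algebra; with them, no known
# classical or quantum algorithm solves it efficiently at real-world sizes.
print("one sample (a, b):", samples[0])
```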
I'm curious about the threat space that we're
talking about here. Obviously, there's been a motivated...
You discussed this a little bit, even dating back to the 90s. Theoretically,
we were thinking about the possibility of quantum computations
being able to do things like factor large numbers into their primes
faster than traditional computers.
And you just discussed with lattice-based techniques
the ability to make that a hard problem, right?
Not something that even a quantum computer could accomplish
or could break in a reasonable amount of time.
Could you talk a little bit about the threats right now, though,
right now or in the near future that we would be facing if we don't move forward implementing these post-quantum or quantum
resistant encryption algorithms? The online world that we live in, we buy things online,
we send emails that have information that should be protected. We've got our medical records.
All of this is behind the scenes protected by cryptography that most people don't think too much about
because it's all taken care of in your browser
or by whatever application you're using to send that information.
But it really is all dependent on secure cryptographic algorithms.
There's many different types of cryptographic algorithms that are used.
We have some that are called public-key algorithms,
some that are called symmetric-key algorithms.
And that threat I was talking about from quantum computers
relates directly to the public-key cryptosystems
which are employed today.
There is an impact to the symmetric-key algorithms
from quantum computers, but it's not as drastic. It won't completely break those algorithms. It'll mean at most that we need to use longer key sizes for those algorithms, which we can do. That's much more manageable.
Although, when I think about something like a larger key size,
I think about the processing required to implement that
and the time it would take to actually accomplish that type of encryption.
Is that a consideration?
It would be annoying, but it's actually not too big
of a deal. Right now, the main symmetric
key algorithm used to encrypt data
is known as AES.
Many users use a key size
that's 128 bits.
That's fairly small.
To protect against a quantum computer, it's known
that if you went up to AES-256
or just doubling the length of your key,
you could use the same algorithm,
just a longer key.
The performance impact would be pretty negligible
because 128 bits to 256 bits
really isn't that much when you implement AES.
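As a quick sketch of what "just use a longer key" looks like in practice, the snippet below encrypts the same data with AES-128-GCM and AES-256-GCM using the third-party `cryptography` package; only the key length changes.

```python
# Sketch: doubling the AES key length. Assumes `pip install cryptography`.
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

plaintext = b"same data, same algorithm"

for bits in (128, 256):                          # 256 bits for extra quantum margin
    key = AESGCM.generate_key(bit_length=bits)
    nonce = os.urandom(12)                       # 96-bit nonce for GCM
    ciphertext = AESGCM(key).encrypt(nonce, plaintext, None)
    print(f"AES-{bits}: key {len(key)} bytes, ciphertext {len(ciphertext)} bytes")
```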
It's on the public key side
that we have more of a problem
because Shor's algorithm
would completely break every single one of the public key algorithms that we use today. That
includes RSA, elliptic curve cryptography, Diffie-Hellman. And so it's those algorithms
that we have to replace. Otherwise, we're going to be vulnerable and people will be able to get
access to information that should be
protected if a large-scale quantum computer comes out and you're not using post-quantum algorithms.
Got it. So you mentioned Shor's algorithm. Could you talk us through that?
Yeah. So Peter Shor is a scientist from MIT. Back in the 1990s,
he was working on algorithms that would run on a
theoretical quantum computer. They did not exist. And he came up with an algorithm that what it
ultimately does is it finds patterns. It finds a period. The period means the time it takes until
something repeats itself. And he was able to come up with an efficient algorithm that if you had a large-scale quantum computer,
would be able to run in minutes or hours, depending on how big the quantum computer is.
And he noticed that, well, if you can find the periods,
that you could then translate these hard math problems
that I've talked about, factoring, another one known as the discrete log, you can translate
those problems into needing to find the period of certain things. And so when he noticed that,
then his algorithm became a quantum algorithm that would solve integer factorization and would solve the discrete
log problem efficiently, faster than our
current computing technology can do. And that makes it
very disruptive for cryptography.
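The quantum speedup in Shor's algorithm is confined to the period-finding step; the classical bookkeeping around it can be shown directly. The toy sketch below brute-forces the period of a^x mod N (the part a large quantum computer would do efficiently for huge N) and uses it to factor a small number.

```python
# Toy of the classical reduction behind Shor's algorithm: factor N by finding
# the period r of f(x) = a^x mod N. Only the period-finding step is quantum-
# accelerated in the real algorithm; here it is brute force, so N must be tiny.
from math import gcd

def find_period(a, n):
    """Smallest r > 0 with a**r % n == 1 (requires gcd(a, n) == 1)."""
    x, r = a % n, 1
    while x != 1:
        x, r = (x * a) % n, r + 1
    return r

def factor_via_period(n, a):
    g = gcd(a, n)
    if g != 1:
        return g, n // g                       # lucky: a already shares a factor
    r = find_period(a, n)
    if r % 2 == 1 or pow(a, r // 2, n) == n - 1:
        return None                            # unlucky choice of a; try another
    p = gcd(pow(a, r // 2, n) - 1, n)
    return p, n // p

print(factor_via_period(15, 7))                # -> (3, 5)
```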
We'll be right back.
Do you know the status of your compliance controls right now?
Like, right now.
We know that real-time visibility is critical for security,
but when it comes to our GRC programs, we rely on point-in-time checks.
But get this.
More than 8,000 companies like Atlassian and Quora have continuous
visibility into their controls with Vanta. Here's the gist. Vanta brings automation to
evidence collection across 30 frameworks like SOC 2 and ISO 27001. They also centralize key
workflows like policies, access reviews, and reporting,
and helps you get security questionnaires done five times faster with AI.
Now that's a new way to GRC.
Get $1,000 off Vanta when you go to vanta.com slash cyber.
That's vanta.com slash cyber for $1,000 off.
And now, a message from Black Cloak.
Did you know the easiest way for cyber criminals to bypass your company's defenses is by targeting your executives and their families at home?
Black Cloak's award-winning digital executive protection platform secures their personal
devices, home networks, and connected lives. Because when executives are compromised at home,
your company is at risk. In fact, over one-third of new members discover they've already been
breached. Protect your executives and their
families 24-7, 365 with Black Cloak. Learn more at blackcloak.io.
When I talk to folks about quantum and the post-quantum age,
the sense I get is a lot of people even today still think of this as science fiction. They think of the quantum age as being decades and even potentially centuries into the future.
They don't see it as a present threat.
What NIST is telling us, by working on and investing in this program for now eight years and publishing these standards, is that this is something that the government, at the highest levels, cares about and is investing time and resources in.
Can you walk us through kind of why it's not really science fiction, why it is a threat today, and the ways in which cybersecurity professionals need to be thinking about this technology?
So with regards to science fiction, I mean, there are companies that have built quantum computers that are on the marketplace today and that you can use.
IBM has quantum computers that you can interface with online.
Now, these are small quantum computers, but they are quantum computers that do things that classical computers cannot. They are not yet at the size where they are
solving a lot of the problems that would be really beneficial to society, but it's predicted that
within 10 to 15 years, they might be able to get there. So we're still not quite there yet, but there's
been significant progress made over the past decade or two. And we do have small ones.
You asked about the threat. Yeah, the US government is taking the threat of quantum
computers being able to break cryptography very seriously. And let me paint a picture that is a little bit counterintuitive as to why.
This threat is often known as store now, decrypt later,
or harvest now, decrypt later.
And it's the idea that your data today,
you are actually at risk from potentially a foreign adversary
who has a quantum computer,
even though they don't yet have that quantum computer.
And that sounds a little strange, but the scenario is,
imagine you've got all your data now and you've encrypted it, it's protected,
and suppose maybe it's national secrets that need to be secure for 30 years or something, we'll say.
Okay, so you've encrypted it. Great. Suppose an adversary is able
to get a copy of that data. Maybe you stored it in the cloud or they found a vulnerability.
You're not too worried because it's encrypted. They can't read it. But maybe a quantum computer
comes out in 15 years and they're then able to get access to that data that you were hoping was going to be
secure for 30 years. So you're actually already at risk today, even though that computer won't
come out for 15 years. And that just underscores the need to why we need to have these algorithms
and standards in place well before a quantum computer comes out there, so that you can make
sure your information is protected
for as long as it needs to be.
So even with the best guess of 10 to 15 years
before a quantum computer of the right size
and power capabilities can decrypt
our public key cryptography today,
we're already a little bit behind the curve.
Potentially.
Now, when we encrypt our data,
we don't use public key encryption to encrypt it. Typically, we'll use symmetric key encryption.
However, that symmetric key you're using to encrypt the data may have been created
through public key means. So the exact details of whatever application you're using, you have to look into that to see.
But in some circumstances, yeah, you could already be at risk if you have not
sufficiently protected your data, you know, using layers of defense to protect it. That's true.
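To illustrate that point, here is a hedged sketch of the usual hybrid pattern: a Kyber-family KEM establishes a shared secret, which is then turned into the symmetric key that actually encrypts the data. It assumes the liboqs-python wrapper (`oqs`) and the `cryptography` package; the algorithm identifier and the plain hash standing in for a proper key-derivation function are illustrative assumptions.

```python
# Sketch: post-quantum KEM supplying the symmetric key that encrypts the data.
# Assumes liboqs-python (`oqs`) and `cryptography`; the algorithm name is a
# placeholder ("Kyber768" may appear as "ML-KEM-768" in newer releases).
import hashlib
import os

import oqs
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

KEM_ALG = "Kyber768"

with oqs.KeyEncapsulation(KEM_ALG) as receiver, \
        oqs.KeyEncapsulation(KEM_ALG) as sender:
    public_key = receiver.generate_keypair()                   # receiver publishes this
    ciphertext, shared_secret = sender.encap_secret(public_key)
    assert receiver.decap_secret(ciphertext) == shared_secret  # both sides agree

# Hash the shared secret into a 256-bit AES key (a stand-in for a real KDF),
# then do the bulk encryption symmetrically.
aes_key = hashlib.sha256(shared_secret).digest()
nonce = os.urandom(12)
protected = AESGCM(aes_key).encrypt(nonce, b"data that must stay secret for decades", None)
print("symmetric ciphertext bytes:", len(protected))
```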
So for organizations now looking at these standards, these new standards and the new NIST-approved algorithms, what goes
into implementing them? What's going to be the process like? What's going to be the timeline?
What's going to be the requirements for an organization to get their minds wrapped around
successfully implementing these algorithms? So we hope organizations and companies have
been aware of post-quantum for a while now and have been looking into this to understand the threat and to be looking at the algorithms that were going to be coming out so that this wouldn't be a surprise.
We know that migration to these algorithms, it will be tricky, it will be complicated, it will be costly.
In terms of implementation, I think most people won't be implementing this for themselves.
There will be vendors and libraries that implement them, that go through validation and testing to make sure that their products are certified.
And so for a lot of people, it will need to be talking to their vendors to find out if their vendors have implemented the algorithms and if it's passed testing and validation.
Another important step organizations need to be thinking about is they need to find out where they're using cryptography.
That's a tricky question that most people don't know the answer to. And it can require software
tools that will scan your systems and find out where's your data that's being protected, what
algorithms are protecting it.
Because you can't simply migrate to a new algorithm unless you know where you need to, that is, where the crypto you need to migrate is being used.
So yeah, there's a lot that goes into it.
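Commercial discovery tools do this far more thoroughly, but as a rough sketch of that "find where you use cryptography" step, the snippet below just walks a source tree and flags files that mention quantum-vulnerable public-key algorithms; the file extensions and keyword list are illustrative assumptions.

```python
# Rough sketch of a cryptographic inventory pass: flag source and config files
# that mention quantum-vulnerable public-key algorithms. Real discovery tools
# also inspect binaries, certificates, protocols, and network traffic.
from pathlib import Path

HINTS = ("RSA", "ECDSA", "ECDH", "Diffie-Hellman", "secp256", "ed25519")
EXTENSIONS = {".py", ".java", ".go", ".c", ".cpp", ".ts", ".conf", ".yaml"}

def scan(root="."):
    findings = []
    for path in Path(root).rglob("*"):
        if path.suffix not in EXTENSIONS or not path.is_file():
            continue
        try:
            text = path.read_text(errors="ignore")
        except OSError:
            continue
        hits = [h for h in HINTS if h in text]
        if hits:
            findings.append((str(path), hits))
    return findings

for file, algorithms in scan():
    print(f"{file}: mentions {', '.join(algorithms)}")
```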
We recommend people definitely have somebody
who has this on their plate, who's leading an effort to think about this and make a plan for their organization. It's going to take time. The
standards have come out. It'll probably be a year or two before we start to see lots of products
that have implemented it and have been tested and are on the marketplace being sold. Even then,
it'll take, we estimate, 10 years, 15 years for the transition to occur.
Because this is one threat.
People that manage information security, there's many threats, there's many things on their plate.
And it takes time; you can't just flip one switch and this problem is solved.
So we expect the transition will probably take 10, 15 years to occur.
Got it.
Well, so in that time then,
what comes next for you and NIST and the program that you're a part of?
So we're trying to help make the migration
as smooth as possible.
NIST has what's called
the National Cybersecurity Center of Excellence.
It's running a migration to PQC project
where we've partnered up with around 40
to 50 industry partners to help develop tools and to come up with guidance and to learn best
practices to ease the migration. And it's really great to see these companies are working together
to make the migration as smooth as possible. So that's a great resource that people can
turn to for information there.
For us, we're also still working on standardization. So we have these first
algorithms that were selected, but we know that research is going to continue to evolve.
We have a few algorithms that are still in a fourth round of evaluation from our
competition-like process. We will likely select one or two of those to
standardize within a couple of months, and we'll write the standards for them. Those will be to
complement Kyber, the encryption or key establishment algorithm that I talked
about. We had three signatures, but only one key establishment algorithm, and so these will add to those numbers. And then also we have another standardization project going on. I mentioned that SPHINCS+ is a little bit big and slow
for not being based on lattices and many applications might have a hard time using it.
So we called for new submissions for signature algorithms not based on lattices that would be better performing than SPHINCS+.
And in response, we received 40 different submissions.
So we're in a multi-year process to evaluate all these.
That's a big project.
Yet we might select another one or two to standardize out of this.
So we're going to be dealing with a lot of fun cryptography for the next several years.
Got it. So a lot of research still to be done in evaluating them. And I imagine you're looking at
efficiency and resilience to various attacks. To what extent does the community outside of NIST
participate in that type of validation process? Oh, we rely a lot on the external cryptographic community.
We have a very talented team here at NIST.
There's not enough of us to do all the analysis that would be required.
The original process had over 80 different algorithms submitted.
So we rely on the cryptographic community a lot, and we're very thankful for their efforts.
They do a lot of research.
They do a lot of benchmarking, and they publish their results.
There's conferences dedicated to this.
Industry has people that are also doing benchmarking, and they're putting these in real-world protocols to see how they run, how timings are affected. And so it's a great worldwide effort that has been very, very collaborative, and it's been pretty cool to see. And the main things that
we look at when we evaluate, first and foremost, is security. We need these algorithms to be secure
against quantum computers, but also our current classical computers, because those aren't going
away. That'll still probably be the main computers that people have and use. The second criterion is performance. And we're looking at a lot
of different performance benchmarks on a variety of platforms, both servers, cell phones, lightweight
devices. We're looking at things like what are the key sizes? What are the signature sizes? What are
the ciphertext sizes? How is the bandwidth affected?
Many of these algorithms are larger than what we're used to with RSA and elliptic curve cryptography.
And so there could be some challenges for protocols and applications that have to handle the larger keys and larger signature sizes.
So those are some of the things that we look at when we're evaluating these algorithms.
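To put rough numbers on those size differences, one can simply generate keys, ciphertexts, and signatures and measure them, as in the hedged sketch below; it again assumes the liboqs-python wrapper (`oqs`), with placeholder algorithm identifiers that vary by library version.

```python
# Sketch: measuring the object sizes discussed above (keys, ciphertexts,
# signatures) empirically. Assumes liboqs-python (`oqs`); algorithm names
# are placeholders and vary by library version.
import oqs

with oqs.KeyEncapsulation("Kyber768") as receiver, \
        oqs.KeyEncapsulation("Kyber768") as sender:
    pk = receiver.generate_keypair()
    ct, ss = sender.encap_secret(pk)
    print(f"KEM: public key {len(pk)} B, ciphertext {len(ct)} B, shared secret {len(ss)} B")

with oqs.Signature("Dilithium3") as signer:
    pk = signer.generate_keypair()
    sig = signer.sign(b"benchmark message")
    print(f"Signature: public key {len(pk)} B, signature {len(sig)} B")
```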
And we do love to hear about big international collaborative projects like this.
So, you know, it's nice when a whole community gets together around a common mission,
especially one of security.
And to give maybe some numbers to it,
we have an online mailing list called the PQC Forum where we have a lot of discussions and questions.
And there's over 3,000 members on that PQC Forum. Many are there just to listen, but there are several hundred
that are active and post questions and answer questions and things like that. So it's definitely
a large community. Love to hear it. Love to see it too. Dustin, well, you mentioned a couple
resources. We will have links to those in the show notes for this episode,
NCCOE and some of the other work that you've mentioned.
Really appreciate you coming on.
Thank you.
Thanks for having this conversation and helping spread the word.
That's Dustin Moody, mathematician at NIST,
speaking with N2K's Brandon Karpf.
You can find a link to the newly released standards in our show notes.
Thanks so much for joining us. We'll see you back here next time.