Storage Developer Conference - #156: Quantum Technology and Storage: Where Do They Meet?
Episode Date: November 2, 2021...
Transcript
Hello, everybody. Mark Carlson here, SNIA Technical Council Co-Chair. Welcome to the
SDC Podcast. Every week, the SDC Podcast presents important technical topics to the storage
developer community. Each episode is hand-selected by the SNIA Technical Council from the presentations at our annual Storage
Developer Conference. The link to the slides is available in the show notes at snia.org/podcasts.
You are listening to SDC Podcast, Episode 156.
Hi, everyone. My name is Doug Finke. I'm the managing editor of the Quantum Computing Report,
and I'm going to talk to you a little bit about quantum computing and technology and where it meets with storage.
I want to thank the people at SNIA for inviting me to make this presentation.
I hope you find it interesting. There's been a lot of hype about quantum computing in the industry recently,
and what I want to do is give you an idea of where it can be used, both its advantages
and disadvantages, and give you a sense of where I think it might impact the storage industry in
the future. So let's get started. The way I explain quantum computing to non-technical people is to mention that when the first computers, the vacuum-tube-based computers, came out in the late 1940s and 1950s,
they were based on physics principles that were discovered in the 18th and the 19th centuries.
Things like Ohm's law, electrons, Boolean algebra, those were all discovered before the year
1900. And it turns out that in the 1920s, sort of a revolution in physics was made with the
discovery of quantum mechanics. And some very famous scientists, Einstein, Heisenberg, Schrodinger, and others,
got together and they came up with a theory of quantum mechanics,
which really revolutionized physics as we know it.
But most of those discoveries were not really leveraged immediately.
Semiconductor technology started to use some of that beginning in the 1950s, but it only used a few of those principles.
And the real secret of quantum computing is that it leverages some of those physics principles that were discovered in the 1920s to come up with more powerful computers that we're just starting to utilize and develop today.
So let's talk a little bit about what some of those physics principles are. The one I think that people at SNEA know about is tunneling. This was actually originally
proposed in 1927, and it didn't really become a reality for flash memory and EPROMs until about
45 years later in the 1970s. But this is, of course,
something where you can tunnel through a barrier. It is used in a branch of quantum computing called
quantum annealing, which allows you to tunnel through an energy barrier to find a ground state
and an optimum solution. I won't talk too much about quantum annealing right now. It is sort of
a niche application, but you can follow up with me afterwards if you want to hear more about it.
One that I think you do hear about quite frequently is superposition.
And this is the concept that a qubit, which is a quantum bit, can be both in the zero state and the one state at the same time.
And I show that here with this diagram with these pennies.
Heads would be zero, tails would be one.
But you can have the penny that's sort of spinning on a table,
and it's sort of both zero and one at the same time.
But ultimately, the penny will collapse, and it'll collapse to zero or one.
And qubits actually will do the same thing, which I'll talk about in just a couple more slides. Another principle that quantum computing
leverages is something called entanglement. This is the fact that you can have two qubits, two photons, or
whatever, that are linked together, and if you do something to one of the qubits, it'll impact the other one. Einstein called it spooky action at a distance. Some people, when they look at this, believe that entanglement enables communication faster than the speed
of light. That is not true. You cannot violate the speed of light, because random bits are not
equal to useful data. A third thing is called interference. This is actually a diagram of a two-slit experiment
where you can have two waves,
and when the two waves combine,
you can have constructive interference,
and you'll get these bright areas.
If they cancel out each other, you'll get these dark areas.
And that's actually used in quantum algorithms
to help you emphasize the solutions you want and de-emphasize the ones you don't want.
So let's talk a little bit mathematically about what a qubit is.
The way qubits are described mathematically is, as I said, as a combination of the zero state and the one state.
And you have a coefficient here, alpha times the zero state, beta times the one state.
Alpha and beta can be negative numbers or they can even be complex numbers.
And that will allow you to have phases.
Qubits can also have phases as well as magnitudes. But an important
thing to understand is that the probabilities always have to add up to 100%. So you take
the absolute value of alpha squared plus the absolute value of beta squared,
and you get one, which would be 100%. If you were to measure the qubit, what you will find is that it's a
completely non-deterministic measurement. If, for example, you had a qubit that's right
at the center there, the equator of this little Bloch sphere, where it has equal magnitudes
of zero and one, and you ran a sampling of, let's say, a thousand times,
you would get a zero roughly alpha-squared percent of the time and a one beta-squared
percent of the time. But it's completely non-deterministic. And actually,
people use this to make quantum random number generators.
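To make that concrete, here is a minimal sketch in plain Python/NumPy (not any particular quantum SDK) of how a single-qubit state and its measurement statistics can be modeled; the state vector, the normalization check, and the sampling probabilities all follow the formulas just described.

```python
import numpy as np

# A single qubit: alpha|0> + beta|1>, with |alpha|^2 + |beta|^2 = 1.
# Here we pick the "spinning penny" state on the equator of the Bloch sphere.
alpha = 1 / np.sqrt(2)
beta = 1 / np.sqrt(2)          # coefficients may be negative or even complex
state = np.array([alpha, beta], dtype=complex)

# The probabilities must add up to 100%.
probs = np.abs(state) ** 2
assert np.isclose(probs.sum(), 1.0)

# Measurement is non-deterministic: sample 1,000 shots.
rng = np.random.default_rng()
shots = rng.choice([0, 1], size=1000, p=probs)
print("fraction of zeros:", np.mean(shots == 0))   # ~0.5, i.e. |alpha|^2
print("fraction of ones: ", np.mean(shots == 1))   # ~0.5, i.e. |beta|^2
```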
But qubits are also very, very fragile. They're very, very sensitive to the environment,
and any environmental disturbance
will cause them to collapse again to the zero or one state.
So that's why if you look at a quantum computer,
you'll see these very expensive refrigerators
called dilution refrigerators that will refrigerate it down to temperatures of like 15 millikelvin.
They'll be magnetically shielded. They'll have vibration isolation. All of these are to keep the qubits as stable as possible before they decohere.
And even with that, qubits still decohere, like the spinning penny that's
on the table there. And the decoherence times are relatively short. Superconducting computers
would have a decoherence time of, let's say, 20 to maybe 150 microseconds. Ion traps are a little
bit longer, up to seconds, but they still decohere. So once they decohere, that's when the magic stops.
So it creates quite a challenge for developers of quantum technology.
Another challenge is something called the no-cloning theorem.
Qubits cannot be copied, at least when they're in a superposition state.
And that can create a lot of problems.
You know, in classical computing, copying is done all the time, but you can't do it in a quantum computer.
So people have to come up with ways of getting around that. So between the collapse upon
measurement, the fact that you have these short decoherence times, and you have the no cloning
theorem, it really creates big challenges for quantum computing.
And that's why it's such a difficult technology, and it's been taking a long time for people to get it to work.
But they are certainly making progress.
Let's talk a little bit about what happens when you take two qubits and put them together.
This is sort of a chart of what I would call a two-qubit register. If you use the qubits classically, each one is either in the zero state or the one state, and that will give you one of the classical states: 00, 01, 10, or 11.
But if you start taking advantage of some of those other principles, you can put each qubit into a superposition, what I call the zero-plus-one state, in which case you'll have all
four of those possible states existing simultaneously: 00, 01, 10, 11. I show mathematically
that you have a scaling factor, in this case the square root of one-fourth. And again, the reason for that is that the probabilities all need to add up to 100%.
But then you can utilize not only superposition, you can utilize entanglement also.
In that case, you can have states like a combination of 00 and 11 that you cannot factor; you cannot reduce them down to individual states of qubit zero and qubit one.
So mathematically, this is what happens. This is the representation of the entangled state. And this certainly adds to the number of states
you can store in a set of qubits. It turns out that the number of states you can store expands
exponentially with the number of qubits. So if you have n qubits, it is comparable in terms of storage of states to two to the n classical bits.
And two to the n can be a very large number very quickly.
It expands exponentially.
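As an illustration (again just plain NumPy, not the speaker's slides), here is a sketch of a two-qubit register: a classical-style product state, the equal superposition with its square-root-of-one-fourth scaling factor, and a Bell-type entangled state that cannot be factored into individual qubit states.

```python
import numpy as np

ket0 = np.array([1, 0], dtype=complex)
ket1 = np.array([0, 1], dtype=complex)

# A classical-style product state: qubit 0 = |0>, qubit 1 = |1>  ->  |01>
product_01 = np.kron(ket0, ket1)

# Equal superposition of both qubits: every amplitude is sqrt(1/4) = 0.5,
# so all four basis states 00, 01, 10, 11 are present simultaneously.
plus = (ket0 + ket1) / np.sqrt(2)
equal_superposition = np.kron(plus, plus)
print(equal_superposition)          # [0.5, 0.5, 0.5, 0.5]

# An entangled (Bell) state: a combination of |00> and |11> only.
bell = (np.kron(ket0, ket0) + np.kron(ket1, ket1)) / np.sqrt(2)

# There is no pair of single-qubit states whose tensor product equals `bell`.
# One quick check: reshape the 4 amplitudes into a 2x2 matrix; product states
# have rank 1, entangled states have rank 2.
print(np.linalg.matrix_rank(equal_superposition.reshape(2, 2)))  # 1 -> separable
print(np.linalg.matrix_rank(bell.reshape(2, 2)))                 # 2 -> entangled
```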
If you have 100 qubits, you can actually hold more states than all the hard drives in the world.
And if you have 300 qubits, you can actually have more states than the number
of atoms in the universe. So you can hold a lot of things there, but there is an issue.
Getting the data out is the issue. There's something called Holevo's bound, which says that you can put the
data into the qubits, but you can't retrieve from n qubits more data than you could get out of n classical bits.
The reasons for that are complicated, but it has to do with the no-cloning theorem, the
collapse upon measurement, and the complexity of entanglement.
The way I sort of think about it is to think of this giant traffic jam here. If you try to extract
that one car out of the traffic jam, there's just no way you could do it efficiently.
So Holevo's bound is a surprising theorem, but it's true. I sometimes call it the Hotel California theorem. So you
can check out anytime you want, but you can never leave. There is logic in quantum technology,
at least in the gate-based machines; you can have quantum gates, somewhat similar to classical gates. In classical computing, you express how gates operate with truth
tables. In quantum technology, you express how they operate with matrices. So you can express the
state of the qubits in a vector, and the operation is a matrix. And you do linear algebra, you
multiply the vector by the matrix, and you get a new vector.
And that shows the new state of the qubit.
Some of these gates are similar to things that you might be familiar with in classical computing.
The X gate, for example, is similar to a NOT gate.
It will change a zero to a one or a one to a zero.
A CNOT gate is more similar to an exclusive OR,
where this is the control qubit here,
and it will flip this target qubit
based on whether the control is a zero or a one.
And there are other gates that are not similar
to what you'd see in classical computing.
A Hadamard gate, for example,
might take a qubit that starts at state zero
and changes it to the superposition state of zero plus one.
But quantum programs and quantum logic
are basically developed
using sequences of these types of gates.
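For illustration, here is a small NumPy sketch (not tied to any quantum SDK) of the gates described above written as matrices and applied to state vectors by ordinary matrix multiplication, including the Hadamard-then-CNOT sequence that produces an entangled state.

```python
import numpy as np

ket0 = np.array([1, 0], dtype=complex)
ket1 = np.array([0, 1], dtype=complex)

# X gate: like a classical NOT, flips |0> <-> |1>.
X = np.array([[0, 1],
              [1, 0]], dtype=complex)

# Hadamard gate: takes |0> to the superposition (|0> + |1>)/sqrt(2).
H = np.array([[1, 1],
              [1, -1]], dtype=complex) / np.sqrt(2)

# CNOT gate: flips the target qubit when the control qubit is |1>,
# similar in spirit to an exclusive OR.
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)

print(X @ ket0)                        # [0, 1] -> |1>
print(H @ ket0)                        # [0.707, 0.707] -> (|0>+|1>)/sqrt(2)

# A two-gate sequence: Hadamard on qubit 0, then CNOT with qubit 0 as control.
state = np.kron(ket0, ket0)            # start in |00>
state = np.kron(H, np.eye(2)) @ state  # H on the first qubit
state = CNOT @ state                   # entangle the two qubits
print(state)                           # [0.707, 0, 0, 0.707] -> (|00>+|11>)/sqrt(2)
```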
Just to compare the performance of quantum gates versus classical gates,
one thing I would mention is that quantum gates are very, very slow. They're very error-prone,
and they quickly lose the data. So if you just look at things like gate times, for example,
state-of-the-art CMOS has gate delays roughly on the order of 100 picoseconds
or 0.1 nanoseconds.
A superconducting quantum computer will have a gate time of somewhere between
20 to 400 nanoseconds.
And the ion trap machines have gate delays on the order of 200 microseconds.
Very, very, very slow.
And you also have these coherence times. So if you want to do the operations, you have to do all the operations before the qubits decohere. In a superconducting machine, the decoherence time
is between 20 and 150 microseconds.
Ion traps are a little bit better,
but it's still a few seconds.
So ion traps have better decoherence times,
but they have longer gate delays.
And in comparison, the DRAM refresh period is 64 milliseconds.
But of course, if you have static logic,
static logic will stay in its state forever.
The other issue is the fidelity, or the accuracy, of the gates. Typical fidelities for a quantum
gate are something on the order of 99.5%. That's not very high. It means that one out of every 200 times, you're likely to get an error in the
gate calculation, and that presents a real problem if you have programs that might need to run
thousands or hundreds of thousands of gates before you get the answer. In comparison,
classical gates just about never fail. You can run them for millions of years with essentially no failures.
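To see why that 99.5% figure is so limiting, here is a quick back-of-the-envelope calculation (my own illustration, not a figure from the talk): with an independent error probability of 0.5% per gate, the chance that an entire circuit runs without any gate error shrinks geometrically with circuit depth.

```python
fidelity = 0.995            # per-gate success probability (99.5%)

for depth in (10, 100, 1_000, 10_000):
    p_clean = fidelity ** depth   # probability the whole gate sequence is error-free
    print(f"{depth:>6} gates -> {p_clean:.4%} chance of an error-free run")

# roughly: 10 gates -> ~95%, 100 -> ~61%, 1,000 -> ~0.7%, 10,000 -> essentially zero
```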
So, you know, with these issues, why do quantum computers have any potential at all? Let me
explain that to you. One of the misconceptions about quantum computing is that it works by
trying all the possibilities at once and selecting the one that works.
That is not true.
And the reason is the problems I was describing before, the no cloning, the collapse upon
measurement, those types of things.
You cannot try all the possibilities at once.
So quantum algorithm developers have to come up with more sophisticated algorithms to be
able to get the answers they want.
Another misconception is that quantum computing might replace classical computing.
That will never happen. Quantum computing is only good for certain types of problems,
and quantum will always be used as an adjunct to classical computing.
Where quantum computing is good, though, is for problems that have
low amounts of data but very highly complex relationships between the data elements.
And the example that I like to use is the traveling salesman problem. Traveling salesman
problem, the complexity, the number of possible paths the salesman can go through is n factorial. So if you
have 100 cities, it'll be 100 factorial, which would take even a supercomputer billions of years
to try to go through all the possibilities, even though the amount of data that you might need for
that program is only a few thousand bytes of data. So there are a number of problems that are like this.
You know, low data, but highly complex interaction.
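Just to put a number on that factorial growth (my own arithmetic, not a figure from the talk): even checking candidate routes at a trillion per second, 100 factorial is hopeless for brute force.

```python
import math

paths = math.factorial(100)              # about 9.3e157 possible routes
print(f"100! ~ {paths:.3e}")

checks_per_second = 1e12                 # an optimistic classical checking rate (assumed)
seconds_per_year = 3.15e7
years = paths / checks_per_second / seconds_per_year
print(f"brute force would take ~{years:.1e} years")   # on the order of 1e138 years
```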
Besides the traveling salesman, you have a number of logistics problems
where people are actively researching how to use quantum computing.
Chemical simulations, computational chemistry, is a very important area.
One example is the penicillin molecule, which only has 41 atoms in it,
but those atoms can have highly complex interactions among all the different electrons. To
figure out the attraction between the electrons
and the protons, and between the drug and the virus or whatever you're trying to cure, can be very,
very complex. So drug discovery is a big area. Battery chemistry is a big area. Simulating
a chemical reaction accurately is what's called an NP-hard problem.
And you cannot do it accurately with a classical computer. And that's where people have a lot of
hope for using quantum computing. And there are a number of problems, particularly in industry,
that are basically binary optimizations. You can express the problem as some type of binary equation where you have a number of variables that are zero or one.
And then what you want to do is figure out the right combination that gives you the lowest value, what's called the ground state for that problem.
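As a toy illustration of that kind of binary optimization (a QUBO-style formulation; the cost matrix here is made up), a classical brute-force version looks like this. The quantum approaches aim to find the same ground state without enumerating every combination.

```python
import itertools
import numpy as np

# A made-up 4-variable QUBO: cost(x) = x^T Q x, with each x_i in {0, 1}.
Q = np.array([[-3,  2,  0,  1],
              [ 0, -2,  2,  0],
              [ 0,  0, -1,  2],
              [ 0,  0,  0, -2]])

best_x, best_cost = None, float("inf")
for bits in itertools.product([0, 1], repeat=4):   # 2^n combinations
    x = np.array(bits)
    cost = x @ Q @ x
    if cost < best_cost:                           # keep the lowest-value assignment
        best_x, best_cost = bits, cost

print("ground state:", best_x, "with cost", best_cost)
```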
Another area: it is possible to solve some big data problems.
Although I was saying in the last slide that you can't retrieve large amounts of data from a quantum computer, you can store it.
There are algorithms called data loaders.
So people are looking at using these for quantum machine learning. And there, if you think about it,
the technique that people use
is a lot like computational storage.
You can use these data loaders to load in the data,
and then you do the operations directly
on the data in the qubits themselves.
And at the very end, you just pull out the answer.
For example, you may want to use quantum machine learning for a training algorithm in a machine learning algorithm.
And you can load in the data. And at the very end, you just pull out the coefficients for the training algorithm.
So the real secret of quantum computing can really be found in this chart here. This is something called an algorithmic complexity chart, which maybe you saw when you studied algorithms; it shows the number of operations as you increase the number of elements.
A good example is sorting. One of the first sorting algorithms
was the bubble sort, which takes roughly n-squared operations to sort an unsorted
list. Later on, people developed a more efficient algorithm called heap sort, which can give you the answer in n log n operations. The secret of quantum computing is that quantum algorithms can be more efficient and scale much less steeply, with a shallower curve
on this algorithmic complexity chart. And probably the most famous example is factoring.
RSA is an encryption technology whose security depends on the difficulty of factoring a 2048-bit number, which would take millions or billions of years
for a classical computer; it would essentially be 2 to the n, or 2 to the 2048, operations.
There's a very famous quantum algorithm called Shor's algorithm, where Peter Shor was able
to figure out an algorithm that could do that in roughly n-cubed operations.
And 2048 cubed is not that large a number, so that becomes manageable.
And people do believe within the next 10 to 20 years, people will be able to have a large enough quantum computer that can actually do that. So the way to think about
this algorithmic complexity and how quantum will compete with classical is that if you have small
problem sizes, small n's, classical will still win out. And that's because of its big advantage
in terms of gate delays. But the classical cost will expand exponentially,
as shown in this purple line here,
whereas quantum might have a linear algorithm
or something that expands less dramatically.
And at a certain crossover size,
you'll start seeing an advantage of quantum
versus classical.
And quantum scientists will talk about algorithms that will have an exponential advantage.
Some will have a polynomial advantage.
But that's where quantum really wins: even though the gates are slower, that's where you get the advantages.
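Here is a rough numeric sketch of that crossover argument (the constants are invented purely for illustration): even if each quantum operation is a million times slower, an n-cubed algorithm eventually beats a 2-to-the-n one.

```python
# Illustrative only: a "classical" exponential algorithm with very fast steps
# versus a "quantum" polynomial algorithm with very slow steps.
CLASSICAL_STEP = 1e-10      # 0.1 ns per classical operation (assumed)
QUANTUM_STEP = 1e-4         # 100 us per quantum operation (assumed)

for n in (20, 40, 60, 80, 100):
    t_classical = (2 ** n) * CLASSICAL_STEP    # exponential scaling
    t_quantum = (n ** 3) * QUANTUM_STEP        # polynomial scaling
    winner = "classical" if t_classical < t_quantum else "quantum"
    print(f"n={n:>3}: classical {t_classical:.2e}s, quantum {t_quantum:.2e}s -> {winner}")
```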
So I would say it's a weird form of parallelism. Think about a classical logic
operation where you want to run the same operation over several different values. You either have to
do those in sequence, like this, or you have to replicate the logic, and you may end up with lots
and lots of logic, but that gets to be very expensive.
And if you double the size of the problem, the cost keeps increasing.
The way quantum algorithms work is they first set all the qubits to this superposition state.
Typically, it's the zero plus one state
where you have equal magnitudes of zero and one.
And then you run the logic operation
and you get all the answers,
all the possible answers out
all at once at the same time.
So for example, if this were an adder,
you could set each of the inputs to that superposition state, and you would get all the possible sums, you know, 0 plus 0, 0 plus 1, 0 plus 2, all the way to 15 plus 15.
And you get all the answers out, but you sample them based on a probability.
If you did this a million times, you'd get a little chart like this showing the probability distribution.
And the way that quantum algorithms work is that the gate sequence they put in
choreographs the answers such that the answer that you want starts turning out with a higher probability, and the answer that you don't want
will turn out with a lower probability using interference, constructive interference and
destructive interference. And that's the secret of quantum computing; that's how those algorithms
can give you the answer. Here's an example of something called Grover's algorithm. This is a
search through an unordered list. In a classical computer, there's no way of speeding it up.
If you have to do a search, the worst case is you have to go through all the possible
things before you find the item you want. And that would be on the order of n operations.
Grover's algorithm uses quantum mechanics, and it can do that in square root of n operations.
So that would be what's called a polynomial speedup.
And this is a chart that would show, again, how you start with all the qubits at the state where everything has an equal probability.
And at the end, you end up with a state where the answer has a probability of almost one.
And the answers that you don't want have a probability very close to zero.
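For readers who want to see that amplitude choreography concretely, here is a small statevector simulation of Grover's algorithm in plain NumPy (my own sketch, not code from the talk): after roughly (pi/4) times the square root of N iterations, nearly all of the probability sits on the marked item.

```python
import numpy as np

n_qubits = 6
N = 2 ** n_qubits          # size of the unordered "list"
marked = 42                # index of the item we are searching for

# Start with every basis state at equal amplitude (equal probability).
state = np.full(N, 1 / np.sqrt(N))

iterations = int(round(np.pi / 4 * np.sqrt(N)))   # ~sqrt(N) steps
for _ in range(iterations):
    # Oracle: flip the sign (phase) of the marked item's amplitude.
    state[marked] *= -1
    # Diffusion: reflect all amplitudes about their mean. This is the
    # constructive/destructive interference that boosts the marked amplitude.
    state = 2 * state.mean() - state

probabilities = state ** 2
print("iterations:", iterations)                  # 6 for N = 64
print("P(marked): %.3f" % probabilities[marked])  # close to 1
print("P(any other): %.5f" % probabilities[(marked + 1) % N])
```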
So let me talk a little bit now about quantum memories and the quantum internet.
You'll hear a lot more about that.
Of course, the internet is very popular, and certainly one of the ways it is used quite heavily is sending bits, photonic bits, over a fiber optic cable.
But you can also send entangled photons over a fiber optic
cable, and that gives you some new capabilities that are not available when you do
things classically. The big one is being able to send keys for encryption that are unbreakable just due to the laws of quantum mechanics. As I said earlier,
the RSA encryption algorithm is vulnerable to being broken by a quantum computer
10 to 20 years from now, just due to the fact that quantum computers can factor large numbers.
But if you send an entangled photon over a fiber optic cable,
it cannot be intercepted by someone in the middle,
and it cannot be copied due to the no-cloning theorem.
So this allows you to create a quantum key distribution mechanism
that can replace RSA and not be broken.
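A very stripped-down sketch of that idea follows (it ignores basis choices, eavesdropping checks, and everything else a real QKD protocol such as BB84 or E91 involves): measuring the two halves of a shared entangled pair gives Alice and Bob the same random bit, and a string of such bits can serve as key material.

```python
import numpy as np

rng = np.random.default_rng()

def shared_bit():
    """Measure both halves of a (|00> + |11>)/sqrt(2) Bell pair.
    The outcome is random, but Alice's and Bob's bits always match."""
    outcome = rng.choice([0, 1])   # 50/50 collapse to |00> or |11>
    return outcome, outcome        # (Alice's bit, Bob's bit)

alice_key, bob_key = zip(*(shared_bit() for _ in range(128)))
assert alice_key == bob_key        # identical random key material on both ends
print("first 16 key bits:", alice_key[:16])
```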
Another thing that will also become very important is for quantum computing clusters.
As we all know, classical computing clusters have been a very important technology
over the past 20 years to increase the size of our classical computing capabilities.
That will be very, very important also in quantum computing.
People will set up data centers with multiple quantum computers.
They will be networked with fiber optic cables,
but they will use what we'll call a mini quantum internet to be able to send
entangled qubits from one computer to the other. And that will be an important
technology. However, when you are sending entangled qubits over a fiber optic cable,
there is an issue with long distances. Fiber optic cables have signal loss. Typically, you have to put in a classical photonic repeater roughly about
every 100 kilometers or so. And that works essentially by taking the photon, measuring,
and then repeating it, replicating it. But again, you can't do that directly with a quantum entangled qubit because of the no-cloning theorem.
So people are working to figure out ways of how they can set up a quantum internet and be able to do that over long distances.
And it's a very active area of research.
There are some people using satellites for that, which is possible.
The Chinese have done that, for example, and others have too, but it's very expensive.
But people have also developed or are developing a technology called a quantum repeater, which takes advantage of quantum memories to be able to send entanglement from one spot to another spot that's far away.
And I'll show you how that works.
You will have this quantum repeater. Let's say, for example,
you wanted to share two entangled photons or qubits between Alice and Bob here,
and the distance is twice what you can do without a repeater. You put a
repeater in the middle, and then you set up a couple of entangled sources. The first source
will create entangled photons A and B: send A to Alice and B to the repeater. The second source creates entangled photons C and D: send C to the
repeater and D to Bob. And what the entanglement swapping unit will do is swap the entanglement
such that, at the end, A gets entangled with D, so that both Alice and Bob share the same entanglement.
And then when they measure them,
they will get the same random bit stream out.
And that allows you to repeat it
without violating any of the laws of quantum mechanics.
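Here is a tiny NumPy sketch of that entanglement swapping step (an idealized, lossless model, and only one of the possible Bell-measurement outcomes is shown): starting from entangled pairs (A, B) and (C, D), projecting B and C onto a Bell state leaves A and D entangled, even though they never interacted directly.

```python
import numpy as np

# Bell state (|00> + |11>)/sqrt(2) written as a 2x2 amplitude tensor.
bell = np.array([[1, 0], [0, 1]], dtype=complex) / np.sqrt(2)

# Four qubits A, B, C, D: source 1 entangles (A, B), source 2 entangles (C, D).
state = np.einsum('ab,cd->abcd', bell, bell)   # shape (2, 2, 2, 2)

# The repeater performs a Bell measurement on B and C. Here we project onto
# the outcome where (B, C) are found in (|00> + |11>)/sqrt(2).
projected = np.einsum('abcd,bc->ad', state, bell.conj())

# Renormalize the leftover state of A and D.
projected /= np.linalg.norm(projected)

print(projected)   # amplitudes ~0.707 on |00> and |11>: A and D are now entangled
```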
So where a quantum memory is used
is basically to synchronize the signals.
You never have exactly the same link lengths here.
So what you want to do, when photon B comes in and when photon C comes in, is store them in a quantum memory so you can synchronize them, so that they're sent to the entanglement swapping unit at exactly the same time. This is an important technology
that will be used for the quantum internet so that people can develop these things over long
distances without using satellites. And it turns out that this is an active area of research.
One of the very, very first commercial units is being shipped this month.
There's a startup company in Brooklyn
called Qunnect, an offshoot of research at Stony Brook University in New York, that is developing these units.
You can see sort of a diagram of the unit here. It uses a technology called electromagnetically induced transparency. And it doesn't
save the qubit for very long. Their prototype unit, the beta unit that they're shipping right now,
just has a coherence time of 100 microseconds. And it has a fidelity of only 95%.
But for the application that I just described a second ago, that's adequate.
And you can use this to develop a quantum repeater.
They are, of course, developing more sophisticated versions with better fidelities and longer coherence times.
But you will start seeing
more of this. It's an active area of research, quantum memories, at both this company and at
various university research centers. So let me turn a little bit to how quantum technology will
be used with classical and how it will impact the storage industry.
Quantum processors will always be used as an adjunct to classical processors.
Classical processors will still be very, very important for a lot of housekeeping functions,
maintaining a job queue, compiling programs, storing results, maintaining calibration data, those types of things.
But the classical processor is also used for a class of algorithms called hybrid classical-quantum algorithms.
The way those work is you develop algorithms where a quantum computer may do some of the
processing and send its results to the classical processor;
it'll do some processing and send that back to the quantum processor. And you could do this iteratively over
thousands of times before you get an answer. But this is a technique that people are using
in order to get around the short coherence times I was talking about before.
And there are several algorithms that people have developed that use this approach.
And this will be important.
So classical processors and quantum processors will always work together.
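A minimal sketch of that hybrid loop is below, assuming a hypothetical run_quantum_circuit function that stands in for the call to the quantum processor (here it is just faked classically): the classical side adjusts parameters, the "quantum" side evaluates a cost, and the two iterate until the result converges.

```python
import numpy as np

def run_quantum_circuit(params):
    """Stand-in for a parameterized circuit executed on a quantum processor.
    Here we simply fake the returned expectation value classically."""
    return float(np.sum((params - 0.7) ** 2))    # pretend cost landscape

# Classical outer loop: simple coordinate-wise hill climbing on the parameters.
params = np.zeros(4)
step = 0.1
for _ in range(200):                             # could be thousands of iterations
    for i in range(len(params)):
        for delta in (step, -step):
            trial = params.copy()
            trial[i] += delta
            if run_quantum_circuit(trial) < run_quantum_circuit(params):
                params = trial                   # keep the better parameters

print("optimized parameters:", params)           # converges near 0.7 for each entry
```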
Where storage will be used, storage will always be attached to the classical processor.
No one is ever going to hook up an SSD or an HDD to the quantum processor; it'll always be attached to the classical processor. And you'll be using pretty
much the same form factors, the same technologies that you have today. Most of these implementations
will be over the cloud. Today, Microsoft, Amazon, IBM, Google all have quantum processors attached to the cloud.
The unit numbers are still going to be relatively low over the next five, 10 years. It's going to be in the thousands.
To give you a sense of the market size, people are forecasting that by the middle of this decade, quantum computing will represent about a billion dollar market.
That includes hardware, software and services.
Whereas something like high-performance computing is more like 50 billion dollars,
and cloud computing is something more like 500 billion to a trillion dollars' worth of revenue.
So quantum computing is still going to be relatively low.
It'll only be used for certain specialized applications.
It's still in its very, very early stages.
You will start seeing some usage of it
for commercial usage of it over the next several years,
but it's sort of like the 1950s were for classical computing.
But for the storage industry, my assessment is that it could still have a very significant impact,
with people in the storage industry using quantum computing for some of their internal operations. Things where I think the storage
industry can utilize quantum computing is, first of all, for new material discovery.
If you want to come up with a new type of material for a hard drive or a flash cell or maybe a new
chemical process to manufacture these things, you can use the quantum computer to do a simulation
of the chemistry and be able to figure out the optimum material, the optimum process
to do what you want to do. The second is in logistics, in manufacturing. If you have a
large manufacturing facility, an example I like to use is a wafer fab, where you may have
hundreds of different stations, hundreds of different process steps, hundreds of jobs.
How do you optimize that to get the most efficiency? So you can set that up as an
optimization problem on a quantum computer, and you may be able to get a better solution than what
you can do with classical computing. Certainly, the same thing can happen with distribution. If you have material,
how do you distribute it? What are the models for doing that?
Some of you may actually be starting to use classical AI in your companies right now,
possibly for forecasting or detecting customer patterns or for something else.
And quantum computing will be able to accelerate some of those AI problems. Using quantum machine
learning, you'll be able to do larger types of artificial intelligence with larger sets of data,
and that may be able to help you in your internal operations.
So I think it could have a significant impact, but really more as a user.
I don't think, at least for the next five to 10 years,
it's going to be a market where you're able to develop a specific product that will directly attach to a quantum computer.
So with that, I do wanna thank everyone
for listening to this.
If you wanna learn more, I have a website,
it's called the Quantum Computing Report.
You can see the URL here.
You can also send me follow-up questions.
This is my email here.
And I do have a newsletter that I send out once a week
with the latest news of quantum computing,
and you can sign up for it on our homepage here. So with that, I want to thank you.
Thanks for listening. If you have questions about the material presented in this podcast,
be sure to join our developers mailing list by sending an email to developers-subscribe@snia.org.
Here you can ask questions and discuss this topic further with your peers in the storage developer community.
For additional information about the Storage Developer Conference, visit www.storagedeveloper.org.