a16z Podcast - a16z Podcast: The Cloud Atlas to Real Quantum Computing
Episode Date: June 30, 2017
A funny thing happened on the way to quantum computing: Unlike other major shifts in classic computing before it, it begins -- not ends -- with The Cloud. That's because quantum computers today are more like "physics experiments in a can" that most companies can't use yet -- unless you use software, not just as cloud infrastructure for accessing this computing power commercially but also for building the killer app on top of it. What will that killer app be? With quantum virtual machines and special languages for connecting and trading off classic and quantum computing, companies and developers may be able to help figure that out, not to mention get ahead of this next computing platform (before it surprises them). Ok, sounds great. Only the old rules don't all apply: You have to fundamentally rethink algorithms for quantum computing, just as with previous waves of high-performance computing before it -- from CPU to GPU to TPU and now to QPU. Because as chips evolve, so do algorithms, and vice versa, in an iterative way. But the chicken-egg question of which came first (the algorithm or the specialized hardware for running it?) doesn't matter as much, because the answer itself involves herding chickens: "You're trying to get all of these independent processes to run and cooperate with each other to produce an answer and do so in a way that was faster" than the other way before it, observes Jeff Cordova, interim head of software engineering at quantum computing startup Rigetti Computing. "In hindsight, we really care about the statistical model, not watching the entire movie", shares general partner Vijay Pande, based on his own experiences in the world of high-performance computing. In this episode of the a16z Podcast (in conversation with Sonal Chokshi), Cordova and Pande talk all about the realities of engineering -- and using -- the next computing platform: taking it beyond scientific research and hardening it into practical, commercial, industrial-scale reality. Luckily, the cloud provides a map to get us there, today.
Transcript
Hi, everyone. Welcome to the a16z Podcast. I'm Sonal. And we're here today to talk more about quantum computing. And for those of you that want more of, like, a primer on what it is and how it works, definitely listen to our other podcast. But you don't have to listen to that other podcast for this one. The goal today is really to talk about what it means to actually build something that's so cutting edge. I think that's a buzzword that we throw around so lightly. And what we'd like to do in this podcast is actually really, like, break that down. And joining us to have that conversation, we have Jeff Cordova, who's the head of software engineering at Rigetti. And then we also have a16z general partner Vijay, who's on the board.
Vijay has a long history actually in the world of high-performance computing, because you used to do Folding@home.
Yeah, you know, there's been a long history of advances in computer architecture. You know, the computers that we learned on as kids were very straightforward. But then with high-performance, massively parallel machines, like Folding@home, we couldn't just take our algorithms and convert them. You'd have to really rethink the problem.
When you say highly parallel machines, you literally mean like thousands and thousands of computers running in parallel next to each other, or not necessarily physically close to each other?
In fact, in your case, it was distributed across other people's downtime on their laptops.
So this is like SETI.
Yeah, I think SETI@home came out basically six months before we did.
So this was, for us, it was October of 2000.
The project's now been running for almost 20 years.
So instead of finding alien worlds, you guys are focusing on protein folding.
Yeah, exactly.
And understanding, especially the intersection of what compute could do in biology.
In that case, they're doing calculations for understanding aspects of biology or proteins.
And the interesting thing about that, at least my recollection of it, is that no one thought that algorithm would work.
It didn't look anything like the previous algorithms, except that it was also doing some kind of chemistry that was interesting.
And then they deployed it and they got it working and they continued to make it work.
And now it's like the primary way that you can fold proteins.
In some ways, they were right that it was impossible.
It was impossible to take existing algorithms and just, like, shove them down to a very different architecture.
You basically had to rethink the problem.
And we went through this again.
When GPUs came out, we actually were some of the first applications on GPUs, even before programming languages
existed on GPUs. And by GPUs, we mean graphical processing units, like the kind that Nvidia makes and other
companies make, that were originally used for the gaming industry but are now being
widely deployed in machine learning. Yeah, lots of... Exactly. Yeah, GPUs have great floating-point
performance, useful for calculations. And so again, we had to rethink the algorithm for
massively parallel GPUs. And I think what we're seeing now with quantum computing is, yet again,
a rethinking of the problem. Like, how are we going to take something that's so powerful yet so
different and try to do something really grand with it? One of the key things to understand
here is what quantum computing is and how it compares to classical computing. Vijay was talking about
in the early days of massively parallel machines, there was kind of an expression amongst the
engineers writing software for those, which is it's like trying to herd chickens. You're trying
to get all of these independent processes to run and cooperate with each other to produce an answer
and do so in a way that was faster than just running on a really fast CPU.
What does it matter, by the way, that it was faster than running on a really fast CPU?
Because if you can get good enough results on its alternative, why would you even bother?
Because you can scale the problem up.
Because in theory, you can then add more processors and even scale it further.
And that was the whole promise of parallel computing, which started several decades ago.
And really, honestly, with systems like CUDA.
What's CUDA?
CUDA is NVIDIA's language for doing parallel processing on the GPU.
And there's no language like that yet for quantum computing -- like CUDA for GPUs -- to your point.
Well, there actually is.
Quantum Universal Instruction Language.
I love that as "Quil."
I love the play on that word.
It kind of brings to mind pen and hand and you're doing stuff.
The way that it is similar to the CUDA language for GPUs is it sews together the way you interface quantum computers with classical computers.
And without that, it might be difficult to actually use near-term quantum computers.
And the reason for that is just because unlike classical computers, which you can kind of run for days and weeks or perhaps years at a time, quantum computers kind of run in bursts of 100 microseconds.
They're not quite stable yet.
Like you don't have the full control of the system.
But they're getting to where we can run them for longer periods of time. And in
that period of time, you can do incredibly interesting and complex calculations that actually
can't be done, at least theoretically, on classical computers.
But you need a place to store the results and to interrogate the results and to do other
kinds of classical post-processing on the data that you produce out of the quantum computer.
So there needs to be a way to interface the two.
That's where Quil comes in: hooking together the classical and the quantum machine.
We call that classical quantum hybrid computing.
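For a flavor of what that hybrid stitching looks like in code, here is a minimal, illustrative sketch using pyQuil, Rigetti's Python library for writing Quil programs. It is not from the episode: the Bell-state program, the 1,000-shot count, and the "2q-qvm" simulator target are arbitrary choices for illustration, the exact API varies by pyQuil version, and it assumes a quantum virtual machine and Quil compiler are running locally or reachable over the cloud -- so treat it as a sketch of the pattern rather than a verbatim recipe.

```python
# Sketch of the classical/quantum hybrid pattern: a short quantum program,
# measurements written into classical memory, many repeated shots, and the
# results handed back to the classical host for post-processing.
from pyquil import Program, get_qc
from pyquil.gates import H, CNOT, MEASURE

prog = Program()
ro = prog.declare("ro", "BIT", 2)   # classical register for the results
prog += H(0)                        # put qubit 0 into superposition
prog += CNOT(0, 1)                  # entangle qubit 1 with qubit 0
prog += MEASURE(0, ro[0])           # hand the quantum outcomes back
prog += MEASURE(1, ro[1])           # to the classical side
prog.wrap_in_numshots_loop(1000)    # the QPU runs in short bursts, many times

qc = get_qc("2q-qvm")               # a quantum virtual machine (simulator) target
result = qc.run(qc.compile(prog))   # shot-by-shot bitstrings return to the
print(result)                       # classical host, which takes the statistics
```

The division of labor is the point: the classical side stores the data, drives the loop, and does the post-processing, while the quantum program is the short burst in the middle.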
This is now becoming a common paradigm: first there was just a CPU, then
CPU plus GPU, then CPU plus TPU.
Right.
So central processing unit, to graphical processing unit, to tensor processing unit.
And now quantum processing unit.
Each one of these things are specialized hardware for a particular task.
And that can do things that the other ones really just can't.
But you know, it's interesting because you painted that as a continuum, like CPU to TPU.
And there is some sort of continuum-like effect.
But it feels like when you move into quantum computing, it's actually more discrete than continuous.
Like you're actually doing something very qualitatively different.
Yeah.
It's a completely different type of computation.
GPUs were intended for graphics, but powerful for many things, and now optimized a bit for machine learning.
TPUs have been designed from the ground up for machine learning, and there's interesting pros and cons of each approach.
Quantum computing is actually different still.
Break down a little bit more for me about how...
I mean, first off, understanding the hardware is probably going to be an important part of things.
I mean, like, to code a CPU, okay, no problem.
To code a GPU, you actually have to understand memory access and things like that reasonably well to have a high-performance algorithm.
So this fact that we have to rethink our algorithms is really nothing new.
The way that people would understand protein folding is that they would run one very long trajectory
and then sort of watch the movie of what happened.
And all of this is inherently stochastic and statistical anyways.
And so once you realize what you really want to be doing is statistical inference,
you can do statistical inference with many shorter trajectories.
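As a toy illustration of that shift (my own sketch, not Folding@home code), here is what statistical inference from many short trajectories can look like: rather than simulating one long run and watching the movie, you launch many short, independent stochastic runs and estimate the quantity you care about from the aggregate. The two-state model and its transition probability below are invented purely for illustration.

```python
import random

# Hypothetical two-state toy system: unfolded -> folded. In reality this would
# be a molecular dynamics engine; here it's a made-up per-step probability.
FOLDING_PROB_PER_STEP = 0.002

def short_trajectory(num_steps=500):
    """Run one short stochastic trajectory; return True if it folded."""
    return any(random.random() < FOLDING_PROB_PER_STEP for _ in range(num_steps))

# Statistical inference over many short runs, instead of one long "movie":
# estimate the probability of folding within a 500-step window.
num_trajectories = 10_000
folded = sum(short_trajectory() for _ in range(num_trajectories))
print(f"Estimated P(fold within window): {folded / num_trajectories:.3f}")
```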
Interesting.
And so it became a completely sort of different way to think about the problem.
And in the end, in hindsight, we really care about the statistical model,
not the movie. So many problems have been done the way they've been done because we've only had
access to classical computers. Okay, so how does this apply then to quantum computing?
So you do have to think about the hardware, but this is maybe a notch above that where you have
to think about the nature of the noise models and other aspects that are more unique to the hardware.
The appealing thing about quantum computing is that it allows you to take advantage of these
mixed states to be able to take 2 to the q classical operations and do them essentially in one step.
Yeah, one way that we think about it is that nature
is inherently quantum mechanical.
You look outside and you see the light bouncing off the trees and that's a quantum
mechanical process.
And it turns out it's very difficult to do lots of those simulations on a classical computer
because a classical computer isn't a quantum mechanical thing.
It's deterministic, Boolean logic.
You can kind of think of quantum mechanics as probabilistic in nature.
So one of the rethinks of the algorithms that you have to do is that we have to rethink how
one constructs algorithms so that the outcome of them is, in a probabilistic sense, the answer
we're seeking.
And that's a very different way of thinking about anything, frankly.
We're used to more cause and effect in our life, and we see this macroscopic cause and effect
that's deterministic.
It happens the same way all the time.
But these quantum processors actually give you different answers every time you run them.
And engineering the reality of quantum computing -- how does that play out in practice?
Well, one thing that you end up having to do when you write quantum algorithms is you have to run
them a lot of times and then take statistics on the answer to find out what the answer that nature would
give is. And that's a very different way of thinking about computation. It turns out,
at least in computer science, over the last couple decades, that these probabilistic algorithms
and the like have become very important in solving large-scale problems, that you can kind of
sample a large enough space of answers to get a pretty good answer, even though you haven't
looked through the whole space. And it turns out that quantum computers can actually search
a combinatorially huge space for certain kinds of problems and actually find the real
thing that nature would do. And that's just a fascinating concept to think of what you could do
with that. That's just fascinating to me, especially because I think about the history of statistics,
and the whole science was built on this idea of having limited sample size and sample sets. And then
you kind of moved to this world where it changed entire fields. Like when I think of the early days
of natural language processing versus now, where you have huge data sets to actually be able to
learn on versus having to be parsimonious about your calculations and how you go about it. And now
what you're saying, which I think is completely mind-boggling, is you don't even have to go at the
sample set, you can go to the reality of the actual population. It's like little n to capital
N. It's pretty neat. Yeah. And so here's the interesting thing: for a classical computer,
its power goes like 2 to the n. For a quantum computer, it goes like 2 to the q, where q itself
is 2 to the n. And with the new technologies using silicon qubits, the number of qubits follows
Moore's law. And so in this case, the number of qubits is more akin to, like, the number of
transistors. And so here the size of the quantum computer would roughly double every year. So it's like
2 to the 2 to the n. And here's why that is going to catch a lot of
people by surprise, which is that what will happen is a quantum computer at first will seem like
it won't be all that useful. It'll be below the number of qubits that you need. Maybe you need
100 qubits to solve the problem and the existing machine only has 64. And so a classical computer
would easily trounce it. But then the next year, the quantum computer has 128 qubits. And suddenly
it handily beats any classical computer that ever existed for that problem. You know what that is?
It's actually just basically something that's extremely difficult for human beings to process
mentally, in our own computers up in our heads, which is exponential thinking in
general. Well, and this is hyper-exponential, which makes it even harder. And I think what that means is that for different applications, there'll be a different number of qubits that will be that boundary from where the classic machine loses. And because you have Moore's Law kicking in here, it will just be this very sharp change. People will think the computer won't be useful, and all of a sudden it will dominate. So it's like the classic example where it happens very fast, very suddenly, like it's accelerating, basically. I would say in the couple decades I've spent in Silicon Valley, I've seen the movie a couple
of times, and only a couple of times, because it doesn't happen very often.
And one of the lessons I've learned from that is that it takes a lot of people to build a new market.
It's not possible for one company to build a new market.
And you get innovations from all over the place.
And at some point, you reach that accelerating point, and then the whole ecosystem benefits.
But there are some real fundamental building blocks -- when you say that you've seen that movie before, but only a few times,
there are some things that have to happen in order for that to become a reality.
And when you think about the history of computing, and I actually think we should be careful about this, too,
because we can't necessarily extrapolate from the history of classical computing,
but that's all we have to go on.
So that said, how do we think this is going to play out,
given what we observed before and where we're going next?
One thing to not forget is that people have been working on quantum computing for a couple
decades.
And kind of the remarkable thing is all of that work has reached a point where it's moved
from research into engineering in terms of building the machine.
And that's why we believe that we can build it.
And why IBM and Microsoft and Google and the other players in this ecosystem can
also believe that, and it has to do with the fact that they use the technology that has been
perfected in Silicon Valley and other places over many, many decades. So we know how to --
we know how to make lots of these, if we get the first one working. When you say we
know how to make it, what is that know-how? It's the know-how of semiconductor manufacturing.
The superconducting circuits we make are made using standard semiconductor manufacturing technologies.
What's complicated about them is writing the software to figure out how to make them do what
they're supposed to do. And that's, there's a lot of really interesting physics and mathematics
and computer science that goes into that. Yeah, with that said, though, I think this reminds me
of the early days of computing where just getting your hands on the device is sort of getting
a ticket to sort of something that is really a part of the future. But most kids won't be able
to have a quantum computer in their house. It'll probably be a couple million dollars or something like
that. But with cloud efforts, there's something where this could be so much more broadly available.
What do you mean -- the economics won't necessarily have the Moore's Law property of becoming cheaper?
It will. It's just that these are more akin to maybe the early days of the IBM mainframe. And so there will be a few of them in the world at first. But the difference is that in those early days of IBM mainframes, you know, a kid in Asia or in the Midwest would not have access to one.
No. But what is funny is that the analogy works in another way, though, that is similar, which is that that was the original cloud, in the sense of where it was physically located. Because frankly, we don't really care where the cloud is located. Right. And at the end of the day, we just care about
sharing and time-sharing those resources.
And in those days, people did actually have check-in and checkout sheets to go use the
mainframe for whatever application there was.
But I think the point that your maker is even more valuable is that when it does go
in so many hands to that kid in Asia, the kid in the Midwest, somewhere else, we are
completely surprised by the applications people come up with.
Because we, the inventors have never been good at predicting what their tools will lead
to of what people can do when you put that ingenuity in people's hands.
Yeah, building cloud access into how you get
at a quantum computer will quicken the pace at which the killer apps are found. Instead of there
being one, like there was in the early days of the PC and electronic spreadsheet, we might see
a half a dozen of them pop up all of a sudden. I mean, it just feels very premature to be talking
about cloud computing for quantum computing, when you think about the history of classical
computing and how long it actually took us to get, like, to an AWS-like state. So A, where are we
and B, like, what's your view on where we should go to get there? That's really, that's really
an excellent question. So it turns out that we actually have quantum computers today that work.
It's just that they're software -- they're software simulators. We call them the quantum virtual
machine. You can run a quantum virtual machine in software up to about 30 or so qubits. And what that
means is that people can access and practice quantum programming on a quantum simulator before
the hardware's here and get ready for it. That's mind-boggling. I mean, I'm glad you brought that up
because we need to hear that.
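For a sense of why roughly 30 qubits is where a software simulator tops out, here is a quick back-of-the-envelope sketch (my own numbers, not the guests'): simulating n qubits exactly means storing 2^n complex amplitudes, so the memory needed doubles with every qubit you add.

```python
# Memory cost of simulating n qubits as a full state vector:
# 2**n complex amplitudes at 16 bytes each (two 64-bit floats).
for n in (20, 30, 40):
    amplitudes = 2 ** n
    gib = amplitudes * 16 / 2 ** 30
    print(f"{n} qubits: {amplitudes:,} amplitudes, ~{gib:,.2f} GiB of RAM")

# 20 qubits fits in about 16 MiB, 30 qubits needs about 16 GiB (a beefy
# workstation), and 40 qubits needs about 16 TiB -- which is why simulators
# stall around 30-ish qubits and real hardware gets interesting beyond that.
```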
The number one question I usually get asked is, well, gee, is quantum computing real or science
fiction?
When's it going to be here?
And the answer is it's here now.
But it's in the form of software.
And that will eventually be surpassed with a real piece of hardware, but you can actually
do real quantum programming on the software.
That's important because I think that wasn't true in the previous world of computing.
Right.
And we had no cloud.
We couldn't provide access to a piece of software to everybody.
Providing cloud access to quantum computing and the quantum simulator in the early days has kind
of two main values.
And that falls into two camps.
There's customers who are interested in, how do you use quantum computing or quantum algorithms to solve a problem that I have?
And then on the other side, we have this entire community of enthusiasts, the people who are going to find those killer apps.
I would actually maybe even simplify it to people who have needs and people who have wants.
And essentially just kind of getting right in the middle of that.
That's a great way to put it.
Yeah.
And having a centralized place to both express the need and put the solutions there for those as well.
The big point you were making is that we're trying
to get to this place where we can use cloud deployments, including cloud simulations, as a way
to get there. What happens next after that happens? I think that once cloud access to
quantum computation is available, then we can try to reinvent these algorithms as quantum
programs. So an example that a lot of people are thinking about is how do you make machine learning
go faster? Machine learning has an inner loop or an inner part of the algorithm that's an
optimization step. And the optimization step has to literally look through all combinations
of things to find the right answer. And so the cloud will help facilitate the discovery of such
algorithms. And then secondly, it'll help you couple to the classical computer so that you can
trade off between the classical one where you can use cloud computing resources for your
HPC, your high-performance computing, and you can couple it to a quantum computer.
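As a rough illustration of that hybrid trade-off -- a classical optimizer in the outer loop calling a quantum (or simulated) subroutine for the expensive inner step -- here is a minimal, hypothetical sketch. The evaluate_on_quantum_backend function is a stand-in I have invented: in a real setup it would send a parameterized quantum program to a QPU or quantum virtual machine over a cloud API and return an averaged measurement; here it is just a noisy classical cost function so the loop runs on its own.

```python
import random

def evaluate_on_quantum_backend(params):
    """Hypothetical stand-in for the quantum inner step. A real version would
    compile a parameterized quantum program, run it for many shots on a QPU
    or simulator over the cloud, and return an averaged measurement."""
    x, y = params
    noise = random.gauss(0, 0.01)          # quantum results are statistical
    return (x - 1.0) ** 2 + (y + 2.0) ** 2 + noise

def hybrid_optimize(steps=200, lr=0.1, eps=0.05):
    """Classical outer loop: finite-difference gradient descent that calls
    the quantum inner step over and over, then post-processes classically."""
    params = [0.0, 0.0]
    for _ in range(steps):
        base = evaluate_on_quantum_backend(params)
        grad = []
        for i in range(len(params)):
            shifted = list(params)
            shifted[i] += eps
            grad.append((evaluate_on_quantum_backend(shifted) - base) / eps)
        params = [p - lr * g for p, g in zip(params, grad)]
    return params

print(hybrid_optimize())   # should wander close to [1.0, -2.0]
```

The classical, cloud-hosted side owns the data and the optimization loop; the quantum side is called repeatedly, in short bursts, for the piece it is uniquely suited to.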
So, okay, so what I'm basically hearing is that cloud -- while in classical computing it took a while
to get there -- cloud is actually now sort of like a pseudo-infrastructure. It's sort of a way, basically,
for us to stitch together the reality where we want to be. Yeah. It's a very natural
delivery mechanism. The natural delivery mechanism. So why does that matter? So that's because that's
actually more about the consumption model and more about the go-to-market. I mean, SaaS has really
unique capabilities in terms of the fact that you ship one thing. You don't have to support all these
on-prem. And that's a whole separate sort of discussion. I think here cloud alone is just very much
empowerment and the ability to get this in many people's hands. But all those other aspects will
layer on top once we get to the point that there are these killer algorithms. So beyond the delivery,
there's also this component, though, of stitching together this world of classic and quantum
computing, because it seems like the only way you'd be able to do that is by having cloud as a
connective tissue between those two worlds. Well, it's not the only thing. You need software.
And in particular, you need some way of handing off the computation from the quantum piece to the
classical piece. And Quil can facilitate this transfer of information from the quantum computer
to the classical computer and back. You can't actually store data on these quantum computers.
They're literally, right now, just compute engines -- just amazing compute engines.
Almost like a co-processor. Almost like a co-processor, right. So you have to have the data someplace
else. And so you need another place for the quantum computer to interface with to get that data
and to store its results. Pipe it in, pipe it out. I think it's also important to understand
the other reason for cloud: in days gone by, we would build
these complicated machines and we would install them
on customer sites. Yeah, on premise. Well,
a quantum computer requires a cryogenic
cooling system. Okay. And it requires
a special thermal-
and vibration-stabilized platform. The quantum processor is, like, the size of a
quarter. The rest of it's the size of two or three
refrigerators to house it and keep it cool
at, like, barely
above zero kelvin.
Way more complicated than classic computers. They're fragile.
You lose
cooling in the refrigerator, maybe you damage
the whole thing, right? So you never really, at least in
the early days, want to ever put those things on a customer site. You want to put them in a secure
facility someplace and provide cloud access or remote access to them. So that's the other
reason that it's the right delivery mechanism for this technology. Right. I want to actually
go back to this idea of the hybrid. It's just so fascinating to me because one debate that plays out
when you think of customers adopting next platforms. You have these early adopters who are risk
takers, who are going to get ahead. They're going to try to adopt the new thing first so they get a
competitive edge. And then you have people who are the laggards, and they sort of follow after
everyone else has done it. And then you have people in the middle, which I suspect is a reality of
the Fortune 500 Global 2000, where they really want to get ahead, but there's this tension
between adopting something new and sticking to what you know in the old.
And one of the things I think is fascinating is that people, a lot of people have done hybrid
cloud in classical computing as a way to sort of straddle both worlds -- and maybe that's not ideal,
because you actually want to leapfrog and go to cloud versus doing this intermediary step.
However, in the case of quantum computing, it's a necessity.
It's the only way to currently get there, is what
I'm hearing you say.
It's the right path to the first ones.
Just because we go back to the complexity of operating these systems that are essentially
physics experiments in a can.
But customers want the power of it.
They don't want the hassle of having to manage and operate these things.
Now we have this beautiful thing called cloud computing and cloud access.
And so we actually understand how to build that infrastructure and host those systems and
provide the right API calls and so on.
Even beyond just cloud, there's aspect of microservices that play.
plays into this very naturally because you can imagine quantum microservices that do a variety
of things. And that in a day where you have different servers doing different things,
you're just doing API calls, this would be just another server doing a unique type of API call.
Yeah. And I think it's actually interesting because we talked briefly about VMs and virtual machines
earlier. And it's the next phase of breaking things down to that sort of micro level.
But the other thing that I think is really fascinating is how that plays out organizationally,
because then you have software developers and product managers who are essentially running their own little
business units, mini business units for owning their own project soup to nuts because it's sort of
self-contained in these little containers. And they'll bring in things as needed. Well, I love
hearing that too because it's actually a way for big companies to actually embrace these experiments.
Without even having to know what's sort of inside the box. Yeah, exactly, because you don't have to.
You really don't have to. That's the whole point of the whole cloud in the first place, right?
There's kind of another way to think about cloud too, which is this vertically integrated stack of
technology. So at the very bottom, you have, you know, the compute power itself and the
operating systems. And then it tiers up from that. You've got
some intermediate programming layers.
And at the very top, you have API services and microservices and things like that.
And there's going to be a bunch of other hardware that's classical in a cloud, like your network stack and your storage stack and your other classical compute stack.
On the software side, though, it's literally a soup to nuts ground up effort where you have to build the operating system to run the quantum processor.
Then up above that, you need to actually build the quantum algorithms and the quantum programming language so that you can program the quantum algorithms.
Once it's all in one place, you can kind of provide independent access to those different tiers, depending on who wants to do stuff.
You can basically interact with that tower of software in different ways depending on the level of granularity you want.
One question I have is, when I think of a vertical stack, it seems like that's a problem that's too big for a single startup to tackle.
Like it's something that a Google can do and IBM can do.
Why would a startup be able to do this?
Vertical stacks are very difficult to build.
So I agree with you about that.
But there's a couple different things that make them difficult to build.
One is just the rate at which you can iterate on the different components to see what the verticalization looks like.
You might guess incorrectly -- and usually do guess incorrectly -- about how these layers actually sit next to each other.
But with rapid iteration, you can find out exactly what the layers are.
So the closer they are, the more you can iterate.
Yeah, just the discovery of what those layers are and what the right layers are is an extremely important problem.
And I think a startup has a massive advantage, just because startups are essentially engines for agile engineering.
They themselves are optimization problems.
I think it's maybe even more essential for building a quantum computer because there's so many different pieces that need to be quickly iterated on to figure out how they fit together.
And this is the difference between engineering and science.
The hard science problems have been solved: how do you build a superconducting qubit, and how do you send it radio-frequency data to program it?
The hard part now is going through all the different ways that these things can be hooked together to build a reliable machine.
It's an engineering problem now.
But it does require rapid iteration and fast agile engineering teams in order to solve that problem.
Okay.
So for people who are developers or engineers thinking about getting into quantum computing,
what does it mean for the people who are actually working on this stuff and who are trying to adopt this stuff,
wherever they are in that cycle of wanting to get ahead or catch up later?
Well, I'll use an analogy from the early days of Intel when they invented the microprocessor.
There was no engineer that was called a microprocessor engineer.
It was a combination of electrical engineering and fabrication technology and
maybe even some computer science.
And so Intel made an investment to build a whole team of microprocessor engineers to figure out
how to build the first microprocessor and actually build the successive generations.
And why does it matter?
It matters because the first thing you build usually isn't the thing that
dominates the market.
You have to do many iterations of it to finally get there.
You need a combination of skills in order to be the quantum engineer that's going to build these machines.
You know, the 4004 microprocessor, the very first microprocessor, was built, as I understand it, for a calculator.
But it had that early application, which could get into market.
And I think quantum chemistry could be that application.
That's something that one can do now, but there are limits in the accuracy due to the expense in time.
So for n atoms, the most expensive algorithms scale like n factorial.
Now, there are more efficient algorithms that go like n cubed or n to the sixth.
But those types of algorithms are often not sufficiently accurate to go after areas where real chemistry happens, where bonds break or you look at excited states of electrons.
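Just to make those growth rates concrete, here is a quick illustrative calculation of my own, comparing the n-factorial scaling of the most accurate methods against the n-cubed and n-to-the-sixth scaling of the cheaper ones:

```python
from math import factorial

# Rough comparison of the scalings mentioned above, as a function of the
# number of atoms n: exact methods grow like n!, cheaper (less accurate)
# methods grow like n**3 or n**6.
for n in (10, 20, 50):
    print(f"n = {n:>2}: n^3 = {n**3:.1e}, n^6 = {n**6:.1e}, n! = {factorial(n):.1e}")

# By 20 atoms, n! is already ~2.4e18, and by 50 atoms it's ~3e64 -- which is
# why exact quantum chemistry on classical machines runs out of steam at
# tens of atoms.
```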
We're talking about nature as quantum computing.
It's like what nature really does.
Yeah, and especially a lot of interesting applications from a commercial point of view are enzymes, all the chemistry.
So this is something where a quantum computer could be able to do calculations that are either much bigger or at much higher accuracy --
or actually, essentially, much bigger at much higher accuracy.
And with a size of machine that doesn't have to be this 100,000-qubit machine to do something interesting.
It could be much smaller.
And isn't there also actually stuff you cannot even do in classical computers today for computational chemistry?
Well, it depends on the number of atoms.
So, like, you can do this full calculation for like tens to maybe 100 atoms, but you probably couldn't.
It'd be very difficult,
I think, to do it for 1,000 atoms or 10,000 atoms.
I mean, because chemistry is everywhere.
And you think about energy, when you drive a car, it's going through chemical reactions.
When the proteins in your body are working, there are chemical reactions.
I mean, plants are growing and using fertilizer and all this stuff.
It's all chemistry.
There could be one that gets into market.
And then we start seeing use and uptake.
And that's where it starts getting interesting.
Because once there is an application and these things become cheaper and more ubiquitous,
I think we're going to see the second, the third, the fourth, and fifth.
And it's going to roll from there.
One of the surprising aspects of quantum computing is that scientists and mathematicians and computer scientists and computer engineers don't really know all of the problems that can be solved by a quantum computer.
We just know some of them.
And that's not how classical computing worked.
It was always the case that if you could solve a problem on a classical -- a small classical -- computer, then it would just automatically work better on a big classical computer.
But we don't actually know the class of problems that can be solved on quantum computers yet.
And that itself is a pretty interesting mystery.
Well, that's great.
Thank you guys for joining the a16z Podcast.
Thank you.