a16z Podcast: Quantum Computing, Now and Next

Episode Date: May 13, 2017

Moore's Law -- putting more and more transistors on a chip -- accelerated the computing industry by so many orders of magnitude, it has achieved (and continues to achieve) seemingly impossible feats. However, ...we're now resorting to brute-force hacks to keep pushing it beyond its limits and are getting closer to the point of diminishing returns (especially given costly manufacturing infrastructure). Yet this very dynamic is leading to "a Cambrian explosion" in computing capabilities… just look at what's happening today with GPUs, FPGAs, and neuromorphic chips. Through such continuing performance improvements and parallelization, classical computing continues to reshape the modern world. But we're so focused on making our computers do more that we're not talking enough about what classical computers can't do -- and that's to compute things the way nature does, which operates according to quantum mechanics. So our smart machines are really quite dumb, argues Rigetti Computing founder and CEO Chad Rigetti; they're limited to human-made binary code vs. the natural reality of continuous variables. This in turn limits our ability to work on problems that classical computers can't solve, such as key applications in computational chemistry or large-scale optimization for machine learning and artificial intelligence. Which is where quantum computing comes in. But what is quantum computing, really -- beyond the history and the hype? And where are we in reaching the promise of practical quantum computers? (Hint: it will take a hybrid approach to get there.) Who are the players -- companies, countries, types of people/skills -- working on it, and how can a startup compete in this space? Finally, what will it take to get "the flywheel" of application development and discovery going? Part of the answer comes full circle to the same economic engine that drove previous computing advances, argues Chris Dixon; Moore's Law, after all, is more of an economic principle that combines the forces of capitalism, a critical mass of ideas, and people moving things forward by sheer will. Quantum computing is finally getting pulled into the same economic forces as well.

Transcript
Starting point is 00:00:00 The content here is for informational purposes only, should not be taken as legal, business, tax, or investment advice, or be used to evaluate any investment or security, and is not directed at any investors or potential investors in any a16z fund. For more details, please see a16z.com slash disclosures. Hi everyone, welcome to the a16z podcast. I'm Sonal. Today we're talking about one of the most exciting advances in the history of computing and next platforms: quantum computing. We start by talking about the almost impossible march of Moore's Law, going beyond debates around whether it's finally reached its limits or not, to what that means, not just for the potential of quantum computing, but also for advances in parallel computing, from machine learning and GPUs to FPGAs. And more broadly, how the economics of all this continually change who gets to innovate in the computing industry, and how. And then we cover what is quantum computing and where are we right now in the practical reality of what people can actually do with it, including what the first applications will be, especially given what classical computers can't do, and who are the players in this global race? Our guest on this podcast is the CEO and founder of quantum computing company Rigetti
Starting point is 00:01:09 Computing, Chad Rigetti, in conversation with a16z partner Chris Dixon. So let's start with kind of where we are in the history of computing. Modern computers started, probably for real, in the World War II-ish 1940s. You know, the PC revolution began in the 70s and 80s, the internet in the sort of 80s and 90s, mobile phones now, right, sort of in the heart of that revolution. And, you know, you're working on this kind of new thing, quantum computing. Yeah, there's been several mini-revolutions in computing capabilities at the hardware level and at the software level. I think there's been a few kind of inflection points in that. And one was,
Starting point is 00:01:47 back in the late 50s. We figured out for the first time how to wire together many transistors on a single chip. And this was the invention of the planer integrated circuit. Over the past 50 or 60 years
Starting point is 00:01:59 since then, we've had Moore's Law scaling of those silicon-based devices that have led to really an almost impossible scaling of the power that these microchips have. And they have completely changed the world.
Starting point is 00:02:13 But that really is just kind of the substrate layer of computing technology, really the chip level Now, over that time, we've gone from tiny chips with a few thousand transistors to chips with billions of transistors on them, and the size of those transistors have shrunk by many, many orders of magnitude over that time. And we're now at the point where individual transistors are about 10 or 20 nanometers in size. To put that in context, a human hair is about 10 or 20 microns, I think, so a thousand times larger, and 10 nanometers is about 100 atoms wide. And a transistor, by the way, corresponds to a two-inch vacuum tube.
Starting point is 00:02:50 If you see those computers from like the 1940s, that two-inch thing now fits in the scale of thousands in a human hair. Yeah, and if we zoom out in terms of layers of abstraction, what is this transistor and what are we using it for in any case? The transistor is the core logic element on a chip. And traditional computing works by encoding information in zeros and ones, in digital binary. and those transistors represent that information. So you can represent a massive amount of information on these chips. And we've also learned how to wire together millions of these chips and millions of processors in parallel to build large-scale supercomputers.
Starting point is 00:03:28 We're also at a point, though, where the ways in which we've been making those transistors more energy efficient, so you can pack more of them more densely on a chip, making them smaller, and are starting to run into fundamental limits. Well, particularly the shrinking, the fabrication technologies, are hitting physical limits, correct? So that is one, but there are many, there's a constellation of challenges
Starting point is 00:03:48 that it's more than just a physical size. One is the power density on a chip. So when you switch the memory state of a transistor, you generate some heat and that heat has to be extracted from the device to keep it from melting. And that problem gets really hard as you pack them more and more densely.
Starting point is 00:04:04 Another challenge is, if you want to build a supercomputer out of these things, you need to use many millions of processors in parallel. Well, that was one of the responses to the diminishing Moore's Law, right, with multi-core processors parallel system. Your typical MacBook today, I think,
Starting point is 00:04:19 has multiple cores, your iPhone does. But these are essentially multiple computers running in parallel. And then that at a larger scale is a data center, which might have 10,000 of these or something. But those also have limits because these things have to communicate with each other. And there's just diminishing returns, right, as you connect more of these together. In the same way that if 100 humans working together
Starting point is 00:04:40 aren't a hundred times more efficient than one human. This is actually described by something called Amdahl's Law, which is the lesser-known but perhaps even more important today cousin of Moore's Law. And what it means is as you start to paralyze a computation across many, many processors in parallel, you get a diminishing return because not every step in the computation can be effectively paralyzed. Some of them just have to happen serially. And so the basic approach of building more, more powerful computers,
Starting point is 00:05:06 is starting to hit some real limits. And the physical size of the transistor is one. that leads to from an economic perspective is that the cost of building the manufacturing infrastructure, the cost of putting up a fab to build technology at the 10 nanometer, 20 nanometer scale is extremely large. We're talking tens of billions of dollars now to get to the latest generation of technology. And there's very few organizations in the world that can afford to do that. So the competitive dynamics have been shaped by this economics. One counterargument to this is from the outside, Moore's Law looked like a law of nature.
Starting point is 00:05:40 From the inside, if you talk to people that worked at companies like Intel, they would say every time they were, you know, they felt like they'd hit the limit, and then somebody came with a breakthrough, and it felt maybe they'll continue to come with breakthroughs, number one. Number two, people say A6, more specialized chips. If you look at your smartphone, you have a video processor, comms processor, all these other things, maybe you will get more specialized chips and keep Moore's Law going for another few decades. Yeah, and I think it's possible that it's going. going to continue, I believe there's a seven nanometer node target from Intel. The cost is monumental, and we're at a point of very significant diminishing returns. And so what's happened in the industry over the past five years or so is just relying on brute force acceleration through improvements at the hardware level, at the integrated circuit level,
Starting point is 00:06:27 have kind of slowed down, and people have looked for other ways to accelerate data processing. And one on triggering here is that people confuse Moore's law with Dennard scaling. You know, So, Dendor scaling is a law of the sort of physical law around transistors getting, not physical law, but physical, like, the pattern of packing more transistors into a smaller space. Whereas Moore's law, really the spirit of Moore's law is an economic principle. Yes. Which is when the computing industry really cares about something and the economic engine gets going, things tend to get better very quickly. And so you see this with networking, you see this with storage, you see this sort of kind of across the board, right? One thing happening now is the computing industry is getting very excited about things like machine learning.
Starting point is 00:07:07 And then there's people like you working on kind of the next generation of things like quantum computers. And so, you know, if you sort of think of it Moore's Law kind of writ largely, it's this broader principle that like this whole broad system of capitalism plus research plus, you know, lots of smart people, plus critical mass of ideas, plus a whole bunch of other things has led to just this very, very steady kind of rate of improvement. Yeah, and I think what we're seeing now is that quantum computing is getting pulled into that, into that ecosystem and is beginning to be driven by the same economic forces that have been driving other forms of technology thus far. And so what's happened right now as a result of all this economic pressure on the semiconductor industry. There's effectively a sort of Cambrian explosion happening because companies can, you know, invest $10 million in manufacturing infrastructure and build individual chips that are close to rivaling the capacity of an entire supercomputer. And that supercomputer, the chips and that were built on a $4 billion fact. And it's not just quantum computing is really going to drive another acceleration of the pace of advance in computing capability. It's also neuromorphic.
Starting point is 00:08:14 And we're talking about neuromorphic chips in individual handsets being available soon for machine learning. Invidia has built an incredible business around GPUs, which really kind of owns a parallelization of tasks across a small number of processors, hundreds or thousands of processors in a single die. And then obviously, FPGA-based computing has been a significant advance as well with Microsoft embedding embedding FPGAs in Azure Cloud servers as well. So modern computing has evolved to such an incredible level of performance. It's really shaped the world. But ultimately, there is a conversation that is not happening today around all the things that computers do not do. And while our laptops and our supercomputers and Amazon Web Services are amazing computing resources, there are a bunch of things that they simply cannot solve.
Starting point is 00:09:06 And the reason that's the case is because they compute and they operate in a manner that is very, very, almost very, very dumb. in a sense. They map information into digital binary. And there's nothing in the universe that computes in a similar manner, except for our own computing technology today. And ultimately, the universe itself, and nature at the lowest level, operates on quantum mechanics. And that's kind of the machine language that nature uses. So tell us what is quantum computing. And maybe if you could go back a little bit or 100 years or so when, and like just briefly talk about if you could, the kind of history of quantum physics and how that leads to quantum computing. So quantum mechanics is a theory that's now over 100 years old.
Starting point is 00:09:42 was developed in the first two decades of the 20th century, and for a long time, it was really instrumental in understanding nature for very long time, but is now at the point where we are able to build machines that explicitly behave according to the laws of quantum mechanics rather than classical Newtonian physics, and we're able to control those systems in the laboratory, and we're able to build artificial quantum systems on a chip, and to control the quantum mechanical states of those devices. So what quantum computing really comes down to is encoding information in quantum mechanical states of nature that we can control and deterministically steer to represent data in a computation. And why is representing data in a quantum particle or quantum state?
Starting point is 00:10:29 Why is that advantageous to the traditional method? There's really two core reasons that it comes down to. The first is that quantum mechanics, quantum mechanics is a continuous theory, and quantum variables are continuous variables. So what quantum computing allows you to do is to compute with continuous variables rather than digital binary. So any fraction from zero to one as opposed to any decimal, any real number line, ultimately, you know. As opposed to just two digits. Yeah, yeah, exactly. And the second is that the number of such variables that we have access to to encode information in a quantum system grows as an exponential function of the number of quantum bits on the chip. This is completely different than how
Starting point is 00:11:13 traditional computing works. So if you have a chip with a million transistors on it and you add one more transistor, you go to a million and one, then you have a part per million performance increase in that chip, roughly speaking, in the best case scenario. And with a quantum computer, if you have, if you have 100 qubits and you add one more, you don't have a 1% performance increase, you double the performance. And that persists independent of the memory size. So every quantum bit you add to the system doubles the number of continuous variables to which we have access. And what it means is what appear to be rudimentary quantum mechanical devices can encode a tremendous amount of information
Starting point is 00:11:49 and can be used to compute things that are physically impossible to compute not only with today's supercomputers, but with any foreseeable supercomputer that we're going to be able to build in our lifetimes or anyone else's lifetime. What are some examples of computational problems that you could solve with a quantum computer and you couldn't with a classical computer? I think there's really two categories that we're seeing today where this is going to be taken up first for practical computing applications. And the first is, you know, very naturally derives from what quantum computers are. Ultimately, that's in computational chemistry. So in that world, you're using a quantum computer to simulate and understand another system that is itself intrinsically quantum mechanical. things like small molecules or materials.
Starting point is 00:12:34 And ultimately, what that's going to allow us to do is just get a much deeper understanding of how different molecular species are generated, what properties they have, in ways that are physically impossible to explore today because the combinatorics of molecular spaces is extremely large. It turns out that those equations, those schrodinger equation, which describes systems at the quantum mechanical level, is extremely hard to solve, even on a large-scale, classical supercomputer. We can write down the equations. We know what the equations, you know, we know how they behave. We simply cannot solve them for systems of meaningful sizes. A small molecule with something like 50 atoms is almost impossible to compute the exact molecular structure or the exact electronic structure. It's just, if you sort of graph the computing required with the number of molecules, it just gets, it gets unfeasible very quickly. Yeah. And the reason is because that small system is to some extent a small quantum computer. And it behaves according to the same
Starting point is 00:13:30 laws that give a quantum computer its power. So there's another area I talked a little bit about, earlier, about quantum computing allows you to encode information in continuous variables. And we're starting to discover ways in which we can take this compute power and map it on to optimization problems that underpin a lot of machine learning. And over the next few years, the quantum hardware that we're building is getting better at such a fast rate that we're reaching this point where the bottleneck is going to be understanding the best algorithms to run on those machines to get the most value out of that given compute resource that the quantum chip provides.
Starting point is 00:14:02 And part of what that implies is that you need to build a very sophisticated classical computer around the quantum computer to both leverage its resources and to offload anything from that quantum computer that can be offloaded so that the quantum computer is doing the things that only it can do. So there are two classes of applications. One is our systems in nature, which themselves have quantum properties, and the second are kind of more, you know, classical computing problems that are just are so difficult, so complex, that they are unfeasible for current systems.
Starting point is 00:14:35 And so that would include things like, you know, machine learning and other kinds of optimization problems. Yeah, and large-scale optimization problems. Now, I'm always very careful to predict what the applications of a fundamentally new and very profound technology are going to be. There's always stories, you know, retroactive stories you're going to be able to tell about the lack of vision that people show. When you go back and look at like the early 1980s, everyone talked about how the only use for computers was like recipes and like keeping your recipes and they had a whole bunch of predictions if you look at the old like ads, right? And very few of them predicted Facebook and Wikipedia and YouTube and all these other things that we use all the time. I would argue that the past 30 years have shown us that humans are amazing for learning how to use computers to the most significant impact in their lives. And as we build these systems and as the industry itself develops, I think one of the things that I'm most excited about is watching the unforeseen application and start to materialize.
Starting point is 00:15:33 Well, there's kind of a yin-yang here, right, where so much work, if you go to a typical computer science department, there are people working on better chips and things like this. There are also so many people working on algorithms, but they're all working on algorithms for classical computers. There have been a few people working on quantum algorithms, for the most part, they haven't been focused on that because they don't have those computers to work. on and test on yet. And so therefore, you don't know even what the algorithms and what even that layer is going to look like, like the programming languages and the algorithms and everything else, let alone the end user applications, right? So we know enough to know how much we don't know.
Starting point is 00:16:06 And people have developed some early applications that will be able to run very early and near term quantum hardware. And these are predominantly quantum classical hybrid algorithms where you use a quantum computer to provide directionality into an optimization loop that you're running. running in conjunction on classical hardware. And that's a really exciting application because it really puts the quantum processor in a position where it's doing the thing that it's exceptionally good at. And same for the classical computing hardware that you have.
Starting point is 00:16:34 It's a little bit like a CPU, GPU, and then you'll have your quantum PU. Exactly. Think about quantum computing as providing a new kind of computing capability that will be deployed in a heterogeneous computing environment. And we've worked really hard to develop software that, that, allows us to integrate our quantum computing capabilities seamlessly into existing classical cloud infrastructure. We've developed an instruction language that allows you to write simple programs that target both classical and quantum computers in the same instruction.
Starting point is 00:17:04 So where are we on this? People have been talking about quantum computers for a long time. And there's been various debates as to, you know, how it's progressing. What's the state of the world in quantum computing right now? Quantum computing is arguably the most sophisticated technology that humans have ever developed. We're able to leverage a physical theory that we as individuals, never see on a day-to-day basis because the world averages over all of that kind of quantum mechanical behavior, and we just get the Newtonian universe. And so it's extremely hard to build these chips and to have the quantum mechanical effects that you utilize in a computation, to have
Starting point is 00:17:36 them persist for a meaningful amount of time. And that was the real bottleneck in the field for a long time. This is coherent. This is quantum coherence. When I started my PhD in 2002, I think there was one or two groups in the world that had ever built and demonstrated a superconducting cubit with a measurable coherence time. So qubits are the fundamental elements of a quantum computer. And it's used both for the mathematical abstraction that algorithms theorists can use to develop an idealized two-level quantum system that has two-available states. At the same time, a qubit is also used to represent the physical instantiation of that two-level
Starting point is 00:18:10 quantum system. And this is very different than how we talk about classical computing. In classical computing, we talk about bits and transistors. So the bit is the logical element, the transistor is the physical element. The qubit represents both. And the nomenclature that we have today uses qubit in this kind of double meaning. And so we use superconducting qubits to represent, to manifest these two-level quantum systems that we use to encode information. When I started my PhD at Yale, the field was at a state where quantum computing was the excuse to do this fascinating physics research.
Starting point is 00:18:43 But there are very few people who were thinking seriously at that stage about building a real quantum computer. We spent about 10 years as a community as a whole really demonstrating that we could increase the quantum coherent lifetime of the devices to the point where they could be used for a computation and then learning how to solve the fundamental problems about putting more and more quantum bits on a chip. And so where the field is today is that we're really working on packing enough quantum bits onto a single chip where you can run a useful computation and to simultaneously increase the quality of the quantum Boolean operations that you're. you do during the computation. So the error rates are sufficiently low that the computations are reliable. So how big is the quantum computing industry slash research world right now? Like how many people are working on these kinds of problems? As a field of physics research, the community has grown substantially over the past 10 years or so. There's probably thousands of people around the world that would identify as researchers in quantum computing.
Starting point is 00:19:39 In terms of real effort to build practical quantum computers, it's a much smaller universe. And of course, IBM has a very significant effort in this. Will has a significant effort, and there's amazing scientists and researchers in both of these places. Microsoft, more recently, has gotten involved and has started to make significant investments in quantum computing. And then around the periphery, there are smaller organizations that exist in the ecosystem that are doing some combination of research and in some cases building software, tools, or working on developing potential applications for long-term quantum computing. Quantum computing is very much a global effort. There are significant efforts in Australia and in Western Europe. There are extraordinary people at each age Zurich at Technical University of Delft and all over the place.
Starting point is 00:20:24 You obviously can't name them all. There's also signs of significant progress in China. We recently saw a paper with a multi-cubit experiment that was successfully run by a Chinese group. This is a global race in many ways. And quantum computing is going to reshape the world in a significant way, I think, because the impact of this technology will be profound and will be felt across industries and around the world. world. There's going to be another Silicon Valley where the quantum ecosystem is, it kind of comes up. We often use that term every day and it doesn't sink in that, hey, it's called Silicon
Starting point is 00:20:55 Valley because of silicon microchips. So I picture a quantum computing company, I imagine a bunch of physicists. Tell me about who works at Rigetti Computing. So we are a full stack quantum computing company. We design and manufacture quantum integrated circuits. We we integrate these quantum integrated circuits into a complex system that cools them and then operates them using a microwave and RF control system to run computations on those chips. And then we have a software platform that connects up that quantum computer to cloud infrastructure and allows you to run quantum algorithms on that machine. There's a lot of physicists.
Starting point is 00:21:35 Physicists at various stages in their career, we have what we call junior quantum engineers who are just coming out of college and really great and brilliant young physics majors. We have theoretical physicists, experimental physicists. We also have computational chemists. We have a lot of technicians, electronics technicians. We've hired systems engineers from Jet Propulsion Lab and NASA. We've hired FPGA developers from the aerospace industry who were building autonomous drones before. It turns out that the kind of core technology problems that one needs to solve in order to build quantum computing are being solved in other places.
Starting point is 00:22:08 What doesn't exist, all those skills under one roof and one organization with all those people pulling on the same rope. We have incredible business and people operations folks. We have a lot of software engineers, and this is one of the most impactful things that a software engineer can work on today. You have the opportunity to make foundational contributions to an entirely new computing paradigm that will lead to fundamental advances in many different fields. When do you think regular companies, people will have access to quantum computers? The ideas around neural networks and deep learning have been around for 20, 30 years. People even trace the ideas back much earlier than that. And ultimately, from one perspective, it was the availability of phalanxes of GPUs over AWS that allowed this to really take hold because the number of folks who can contribute to improvements from an algorithmic perspective was significantly increased.
Starting point is 00:22:56 Quantum computing is at the early days where there's maybe a few hundred folks who are working on quantum algorithms around the world. And I would argue that every software developer, to some extent, is working on better classical algorithms. And over the next five years or so, I think the number of folks who identify as quantum engineers or quantum software. software engineers is going to go from approximately zero today to a meaningful number and that the progress on that front will really accelerate. And we're really focused now on developing applications and working with early customers in these core application areas that we discussed. And in really engaging with folks to kind of kick off the flywheel of application development and discovery. So like all computing platforms, there'll be sort of this mutually reinforcing interaction between
Starting point is 00:23:36 the computing platform and the software developer side. And that hasn't begun yet. you get these things in people's hands and you see what they can do with them and all the inventive things they come up with. Exactly. That flywheel won't start. So quantum computing sounds like a very hard research problem, and it's not surprising that IBM and Google and universities are working on it. How can a startup possibly, you know, compete against these giant companies? That's a great question. And it's something that I've thought about a lot.
Starting point is 00:24:02 And, you know, before I started the company, I looked around at the world. And ultimately, it's the kind of mission that is best served by building an organization from scratch, where you can kind of hand select or curate the DNA of the different organizations within that larger company you've got to build to uniquely position it to solve that set of technology problems at this point in history. And that opportunity to build a company from scratch is very hard. And there's a chasm you have to cross to get there. But if you can do it, it gives you a compelling competitive advantage against a larger existing incumbent organization whose quantum computing effort is not going to move the needle in their culture.
Starting point is 00:24:37 Think of this in analogy to electric cars. General Motors are building electric car, and still, there's got to be a Tesla. And eventually electric cars and hybrids are going to kind of be a technology that is adopted across the industry. But there's one electric car company. There's one that matters. There's an economic angle to this, too. And the economic angle is that quantum computing sounds hard, but it is very much a, you know, we're knowledge workers. And ultimately, it's the knowledge of how to build this technology that sets you apart.
Starting point is 00:25:07 And that is not something that can be reproduced at this stage in the industry with mere scale. An army of fabrication process engineers is useful only if you have the foundational knowledge about what it is you're trying to accomplish and how to diagnose whether you've done it or not. Okay. So it's a persistent rumor on the Internet forums. It is that quantum computers will destroy all of our cryptographic systems. What do you think is going to happen there? What you're referring to is Shores algorithm, ultimately. And in 1995, a mathematician at Bell Labs named Peter Shore discovered an algorithm that if one could build a large scale quantum computer, one could run an algorithm that would be able to factor large numbers in polynomial time.
Starting point is 00:25:48 What that means is that you'd be able to threaten the standard encryption protocols that are used around the world from Wall Street to the battlefield. We're probably 20 to 30 years away from having a machine that would really be able to run Shor's algorithm on practically relevant problem sizes. At some point in the future, quantum computers will be able to crack RSA encryption. And so the question really becomes, what is a shelf life of your secrets? Is the blessings for the field? Because that discovery led to research investment from the government that got the field started. It's a curse to some extent because that application, from my perspective, is one of the least interesting. It's not as interesting in relation to the other things that quantum computers are going to help us do.
Starting point is 00:26:28 Ultimately, the things that we get really excited about are using these machines to build fundamentally more powerful, artificial intelligence, using these machines to disrupt wet chemistry and to do simulation-driven design in silico of new materials or new drugs, this is going to significantly affect health care. It's going to affect how we treat disease. It's going to affect how we generate energy and how we feed ourselves as humans. Okay. Great. Thank you.
