@HPC Podcast Archives - OrionX.net - @HPCpodcast-79: Travis Humble of ORNL on Quantum Tech
Episode Date: January 24, 2024
We discuss the state of Quantum Information Science with our special guest Dr. Travis Humble, a global authority on the subject, director of the Quantum Science Center, a Distinguished Scientist at Oak Ridge National Laboratory, and director of the lab’s Quantum Computing Institute.
Audio: https://orionx.net/wp-content/uploads/2024/01/079@HPCpodcast_Travis-Humble_ORNL_Quantum-Tech_20240123.mp3
Transcript
I would say, if anything, in the past five years, that pace of development, the rate
at which we are moving forward, has actually increased. So from my
perspective, I would say I'm very optimistic at the moment about the state
of the field, as well as its development and near-term ability to deliver on some
of these key outcomes that people have talked about.
Well, unfortunately, some of the materials we are trying to build, we will never be able to simulate,
even on today's supercomputers.
They are simply too complex in the way that the atoms and electrons are interacting with each other
that our conventional technologies cannot track those systems.
But you might expect that a computer that is fundamentally built on the principles of quantum mechanics
can actually mirror those systems almost perfectly.
My expectation is that the future of quantum is to have a diversity of technologies.
We will find that each of these modalities actually fits certain use cases.
From OrionX in association with Inside HPC, this is the At HPC podcast. Join Shaheen Khan and Doug
Black as they discuss supercomputing technologies and the applications, markets, and policies that
shape them. Thank you for being with us. Hi, everyone. Welcome to the At HPC podcast. I'm Doug Black with Shaheen Khan. And with us today
is our special guest, Travis Humble. He is director of the Quantum Science Center,
which is a Department of Energy funded partnership comprised of leading academic institutions,
national labs, and corporations. He's a
distinguished scientist at Oak Ridge National Lab and director of the Oak Ridge Quantum Computing
Institute. He holds a joint faculty appointment with the University of Tennessee Bredesen Center
for Interdisciplinary Research and Graduate Education, working with students on energy
efficient computing solutions.
So Travis, welcome. Oh, thank you, Doug. It's a pleasure to be here. Okay, so tell us a little
bit about your areas of focus in quantum and some of the R&D work you've been involved with of late.
I've been working in the area of quantum science and technology for, well, about two decades now.
A lot of that started with my initial research in
quantum chemistry and how to control chemical reactions. But when I came to Oak Ridge National
Laboratory almost 20 years ago, I immediately got immersed in the area of quantum computing.
And of course, this is a really hot topic at the moment, the idea that we can use quantum
mechanics to perform computation. My research in this area has kind of covered the gamut of topics in computing,
including both the development of algorithms and applications for quantum computing systems,
software and programming tools that can be used to develop those applications.
And then most recently, I've been looking at the development and testing and evaluation
of quantum computer
systems that are available today. Alongside my research program, I am also leading the
Quantum Science Center's research agenda, and this is broadly focused around the idea of building
new types of quantum technologies, especially the development of quantum materials for topological
qubits, the development of new types of quantum
simulation platforms, and this is using quantum computers to specifically investigate chemical
and material processes, and then the development of quantum sensor platforms, which can be
applied to really exotic phenomena, including the search for dark matter, as well as the
development and demonstration of quasi-particles in quantum materials. So this position in particular has given me a really broad perspective on all the
different types of topics that are under consideration today in the field of quantum
science and technology alongside my past research in that area.
Okay. Well, stepping back from a 30,000-foot level, I always ask quantum people this. What
are your thoughts on the state of quantum and whether the technology is moving forward in an encouraging way, or is it
somewhat stalled with all the developments and announcements going on? How do you view
the forward movement of quantum? Having been involved in the field for almost 20 years now,
I can definitely say that we are progressing in a forward direction.
We have gone from hypothetical conceptual ideas of what quantum computers are to today having
access to working systems that can be tested. We can try to evaluate their feasibility for solving
challenging problems, and then we can even envision how they may grow and scale up over time. I would say, if anything, in the past five years, that pace of development, the rate at which we are
moving forward, has actually increased. So from my perspective, I would say I'm very optimistic
at the moment about the state of the field, as well as its development and near-term ability to
deliver on some of these key outcomes that people have talked
about for quantum computing and other quantum technologies. It is not a simple field, though,
in the sense that it is integrative. It does require multiple disciplines working together
to try and accomplish these goals. And it is emerging from a long history of physics and
making the transition over into a working viable technology.
And so the fact that that could even happen on a time scale of a few decades, not, you know, to speak
of the additional time that'll be needed, I think that's really remarkable. And so I'm both
encouraged and optimistic about the rate at which things are moving forward.
That's awesome. Travis, you mentioned sensing, and that led me to at least three or four branches of
quantum science and technology, quantum sensing for measurement, quantum communication, quantum
computing. I imagine all three are part of the projects that Oak Ridge and you pursue,
or is this a focus on a particular aspect of these?
Yeah, so we have a very broad program in the quantum science and technology field.
Certainly quantum computing is at the forefront of this alongside quantum communications.
But as you rightly noted, quantum sensing is actually one of the areas that has enormous
near-term potential.
In a sense, quantum sensors are taking advantage of the physics of quantum mechanical systems
in order to improve the resolution and the precision with which we're making measurements. When we apply that to novel
areas like searching for dark matter, which is one of the yet-to-be-discovered particles that
could possibly contribute to the state of the universe, we know that we're going to need new
types of sensing platforms that push well beyond our current limitations.
Quantum sensors are one path forward in that area. And for Oak Ridge and the Quantum Science Center and many others in the field, the development of these new quantum sensors requires us to bring
together existing ideas of signal processing theory and sensors and measurement, but now
framed in this new framework of quantum information
and how it can actually exceed our conventional expectations.
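As a rough, illustrative sketch (not from the episode): the precision gain Humble alludes to can be seen in the scaling laws for sensing. With N independent probes, shot noise limits the measurement uncertainty to roughly 1/√N (the standard quantum limit), while an idealized entangled sensor can in principle reach 1/N (the Heisenberg limit).

```python
import math

# Shot-noise-limited (standard quantum limit) uncertainty from N independent
# probes scales as 1/sqrt(N); an idealized entangled (Heisenberg-limited)
# quantum sensor scales as 1/N. The ratio shows the potential gain.
def standard_quantum_limit(n_probes: int) -> float:
    return 1.0 / math.sqrt(n_probes)

def heisenberg_limit(n_probes: int) -> float:
    return 1.0 / n_probes

for n in (100, 10_000, 1_000_000):
    sql = standard_quantum_limit(n)
    hl = heisenberg_limit(n)
    print(f"N={n:>9,}  SQL ~ {sql:.1e}   Heisenberg ~ {hl:.1e}   gain x{sql / hl:.0f}")
```

The gain grows as √N, which is why precision-hungry searches like dark-matter detection motivate quantum sensing platforms.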
Brilliant.
So if we focus on the quantum computing part, there are applications that I see mentioned
in the press that look more like a quantum experiment rather than really quantum computing.
They sort of seem to lack the programmability aspect that
the word computing would imply. Where do you think that is in the current state of things?
This is an excellent point. The differentiation right now between a quantum experiment and a
quantum device operating is very murky and a bit ambiguous. Oftentimes, we are finding that experimental physicists who
traditionally have stood up these systems in their laboratories as proofs of concept of key
ideas from physics are now making the transition into persisting those systems for longer
amounts of time and allowing other people external to their laboratory to have access to them.
The ability to transition the technology is an entire endeavor on its own.
It touches on many of the topics that are traditionally well outside of science itself.
So I think it is good to point out that there is some ambiguity right now in what's an experiment versus a product, let's say. But that's also part of the excitement,
is that by making these experimental discoveries,
we're actually pushing forward the frontier
of what we can do with quantum physics, quantum materials,
all these different systems that are out there.
And by providing people access to them as quickly as possible,
we're simply going to feed back into
that cycle of discovery, enabling people to now take those systems and program them or test them
or evaluate them in these application areas. So I think you're exactly right that there is
not always a clear line between what's an experiment versus what's, say, a product.
But the truth of the matter is that's really where a lot of the excitement's happening because we're able to get such a quick feedback
cycle in terms of our discoveries of what we can do on these systems versus how we want to design
the next generation. Travis, help us, and I'm sure a lot of people among our listeners, what's a good
way or the best way to describe what a quantum computer does and also why it has the potential to be so
much more powerful than classical HPC? Yeah, this is always a tricky question,
in part because when we talk about computers, conventional computers today, there's a great
diversity of them. We all have a general notion that, yes, they're performing some type of
computation, adding up ones and zeros quickly
for us and doing that with remarkable technology underneath them. With quantum computers, we're
actually trying to extend that definition. We're trying to say that in addition to all those
remarkable things we have access to now, there's a whole new set of opportunities that are available
by harnessing the laws of quantum mechanics.
And so in the way that we traditionally think about adding ones and zeros together,
we now think about superpositions of ones and zeros. And that, of course, kind of requires
a stretch of our imagination for those who aren't familiar in the field. But what we have found
through studying this area is that there are great opportunities when we extend the definition
of operations that are available to a computer. I'll give you an example of this. At Oak Ridge
National Laboratory, one of the key areas we are interested in is understanding materials,
both their properties and their behaviors, but then also how we can create new types of materials
that provide certain designer functionalities. Well, in order to synthesize and characterize
those systems, it is an incredibly time-consuming process, and we spend a lot of effort in
demonstrating these types of measurements. But alongside that, we want to be able to build
computer models of these types of materials that can then guide us in those processes of
fabrication and characterization, and ultimately cut down that
development cycle in order to get to our sought-after material quicker. Well, unfortunately,
some of the materials we are trying to build, we will never be able to simulate, even on today's
supercomputers. They are simply too complex in the way that the atoms and electrons are interacting
with each other that our conventional technologies cannot track those systems.
But you might expect that a computer that is fundamentally built on the principles of quantum mechanics
can actually mirror those systems almost perfectly.
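To put a rough number on "too complex to simulate" (an illustrative aside, not a figure from the episode): an exact classical state-vector simulation of n qubits must track 2^n complex amplitudes, so memory alone grows exponentially.

```python
# A classical state-vector simulation of n qubits tracks 2**n complex
# amplitudes. At 16 bytes per amplitude (two 64-bit floats), memory grows
# exponentially -- which is why some quantum materials are out of reach
# even for today's largest supercomputers.
BYTES_PER_AMPLITUDE = 16  # complex128

def statevector_bytes(n_qubits: int) -> int:
    return (2 ** n_qubits) * BYTES_PER_AMPLITUDE

for n in (30, 50, 80):
    print(f"{n} qubits -> {statevector_bytes(n):.4g} bytes")
```

Around 50 qubits the state vector already exceeds tens of petabytes, and each additional qubit doubles the requirement.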
And so in this way, by building quantum computers,
we can actually create synthetic versions of quantum materials that could not have existed otherwise. And eventually in the
future, use that to guide our development of materials that can be used, let's say for room
temperature superconductors. And that's exactly the type of material that now has an impact on
how we transmit electricity without loss or enable new types of transportation. Now, that's a long
term goal for this type of research agenda. But it's the fact that quantum computers are moving beyond our conventional limits, that we're
now able to solve problems using the methods of quantum mechanics themselves that really opens up
this space in exciting and new ways. Okay. Now, we recently saw a headline that the biggest current
quantum barrier is noise. Is that the same as error correction, which we've heard for years is the biggest quantum barrier?
They're intimately connected with each other.
Fundamentally, when we try to control quantum mechanical systems,
we're having to build something, a tangible object, a material,
or perhaps we're trapping an atom or trying to create a photon.
And these quantum particles are not only very small and very sensitive, but the ability to control them
and bend them to our will is oftentimes a noisy process. So we try to create an atom at a certain
location in a material, and it's actually off by an angstrom or two. And that can have huge repercussions for how
that material behaves. So within quantum computing technology, we're actually now focused on how do
we mitigate the errors and the noise that's occurring through the processes by which we
create these technologies. Now, one of the approaches to that is something called error
correction, which is you actually create a reinforcement method that monitors the state of your material or your quantum
system.
And then when it detects an error, it will have feedback into how that system behaves
to try to correct for that error.
Now, the error correction methodologies that are available today and the ability to integrate
them into a system that behaves fault tolerantly, that is to say that it can tolerate the presence
of errors and faults, is a really complex problem and right at the state of the art
of the field at the moment.
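The redundancy idea behind error correction can be sketched classically (a toy illustration, not the quantum codes used in practice): encode one logical bit into three physical bits, let noise flip each copy independently, then detect and correct by majority vote. Real quantum codes measure error syndromes without reading out the data, but the principle of detecting and feeding back on errors is the same.

```python
import random

# Toy sketch of error correction: a three-bit repetition code with
# majority-vote decoding. Any single bit flip is corrected; the code
# fails only when two or more copies flip.
def encode(bit: int) -> list[int]:
    return [bit, bit, bit]

def apply_noise(bits: list[int], p: float, rng: random.Random) -> list[int]:
    # Each physical bit flips independently with probability p.
    return [b ^ (rng.random() < p) for b in bits]

def decode(bits: list[int]) -> int:
    return 1 if sum(bits) >= 2 else 0  # majority vote

rng = random.Random(42)
trials, p = 100_000, 0.05
raw_errors = sum(rng.random() < p for _ in range(trials))
coded_errors = sum(decode(apply_noise(encode(0), p, rng)) != 0 for _ in range(trials))
print(f"unprotected error rate ~ {raw_errors / trials:.3f}")
print(f"encoded error rate     ~ {coded_errors / trials:.4f}")  # ~ 3p^2 - 2p^3
```

At a 5% physical error rate, the encoded error rate drops below 1%, showing how redundancy plus a correction step suppresses noise.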
So we have existing quantum technologies that we can learn from.
And one of the things that we are learning is that we are going to need error correction and fault tolerance in order to scale those technologies up to the
sizes and performance that we expect is necessary to try and solve some of these most challenging
problems. And it's possible to detect the noise and distinguish it from a proper manipulation
of the qubits. How does it know that this is noise and the other
thing wasn't noise? It was meant to be. Oh, yeah. This is really subtle. Quantum
mechanical systems themselves are very sensitive to their environment. And what I mean by this is if I have
a material and it's not in an absolute vacuum and not interacting with anything around it,
then there's always going to be some transfer of energy and noise and information between that and its surroundings. The ability to distinguish that
type of process from something that I deliberately chose to do to that material is the balance
between the error correction paradigm and my straightforward operation of the quantum system
itself. So one of the engineering challenges,
both at the architecture level as well as the programming level, is to understand how to
interleave error correction and the operations that you want that system to do. By performing
error correction, I can, like you suggested, identify what were the mistakes that were made.
Not necessarily understanding how they were made, but I'll know that mistakes were present. And then by monitoring the operations
that I asked the system to perform, I can now drive it towards a given outcome. Maybe I'm
adding together multiple numbers or I'm trying to simulate a quantum material. In all of these cases,
I've got to interleave these two methods together. Now, one of the consequences of this is that I am now adding operations to my
system's execution. Effectively, in order to combat noise, I've got to do more operations. And
of course, that can add noise to the system as well. So there is a trade-off in these approaches.
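That trade-off can be made concrete with a toy model (an illustrative aside): a three-bit repetition code with majority-vote decoding fails only when two or more copies flip, so its logical error rate is 3p² − 2p³ for physical error rate p. Encoding helps only below a break-even point (p < 0.5 for this toy code); real fault-tolerance thresholds are far lower, which is why hardware quality matters so much.

```python
# Logical error rate of a three-bit repetition code (majority vote over
# three noisy copies): it fails when 2 or 3 of the copies flip, giving
# 3p^2(1-p) + p^3 = 3p^2 - 2p^3 for physical flip probability p.
def logical_error_rate(p: float) -> float:
    return 3 * p**2 - 2 * p**3

for p in (0.01, 0.1, 0.5, 0.6):
    helps = logical_error_rate(p) < p
    print(f"p={p:.2f}: logical={logical_error_rate(p):.4f}  encoding helps: {helps}")
```

Below the break-even point, adding redundancy wins; above it, the extra operations just add more noise than they remove, which matches the boundary today's systems are probing.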
There becomes a point in the
noise and the preparation of the material where error correction does improve the system. When we
are looking at today's latest results from the field, we are finding that the existing quantum
systems out there are right at that boundary of being able to demonstrate improvement in their
operation by using error correction
techniques. And that's really exciting because it hints that there is a big jump in the scalability
of these systems coming up in the next several years. Like the benefit is growing faster than
the opposite of benefit. This is exactly it. There is a bit of a challenge in keeping up with that scaling, because the number of
control wires that you will need just to program and operate these systems, that's going to increase
as well. And so you have to worry about things like the controls and the circuitry and the
energy consumption and all of these aspects too. But the fundamental gain from error correction and being able to scale
these systems up, we're just starting to see experimental evidence that suggests this is
really feasible.
Do you have any perspective on the different so-called modalities, as the industry seems to call the different approaches? Because it seems to me that if you start with just the fundamental particles, photons, atoms, ions, and as of recently, even molecules, you could really harness their quantum effects. Is there any modality that looks more promising than others? Or are we still kind of in discovery mode?
I definitely think that all of the modalities under consideration are exciting
areas and certainly warrant continued exploration. I am hard-pressed to say that any one of them
actually stands out as being the number one contender across all use cases. Truthfully,
my expectation is that the future of quantum is to have a diversity of technologies. We will find
that each of these modalities actually fits certain use cases. And this isn't entirely foreign
from a computing perspective. When I look at a modern computing system, I have specialized
subsystems around memory and algorithmic processing and graphical processing and the communication of
data between systems. And that kind of diversity in the technology is something that we have not
seen in quantum computing and other areas yet. So I fully expect that we will see these other
technologies, all of these technologies, continue to develop and mature and that the real excitement
comes when they start to
be able to interact with each other. Because now it's a bit like having multiple types of materials
for building a new empire. And in this case, that quantum computing empire, I think, will make use
of all the different modalities that are out there. So this is almost like I have GPUs, I have FPGAs, I have other custom ASICs, and they all are good
for what they're good for, some more than others. But I, in practice, will need access to all of
them to be able to do everything. So in that sense, the quantum computer is not one thing,
but will manifest itself in multiple different ways. Is that a fair way of saying it? I think so.
I think the long-term architecture for these types of systems won't be made out of a single technology element
because there are competing concerns in the functionality.
In some cases, I need to very quickly process instructions.
So I'm going to want to use the technology
that has the fastest clock available to it.
But in others, I want to move information
very quickly, and I'm not necessarily trying to process things. So now I need something that can
move very fast, but doesn't have these other requirements on it. And of course, this gives
rise to the idea, well, now I've got to transfer information back and forth between these systems.
So certainly, in my vision at least, the quantum technology of the future is a collection, very much like you were suggesting, of different components and different pieces networked together. That's not to say that there won't be specialty devices developed along the way, but in the grandest version of the future, I do think we need a diversity of approaches.
Travis, you're director of both the Quantum Science Center
and Oak Ridge's Quantum Computing Institute. Could you give us thumbnail sketches of the
missions of those two organizations? Oh, absolutely. So the Quantum Science Center,
or QSC, is one of the Department of Energy's National Quantum Information Science Research
Centers. These centers were established by the National Quantum
Initiative back in 2018, and their purpose is actually to tackle really substantial
problems, challenges in the field of quantum science and technology. QSC is one of five that
the Department of Energy has sponsored. QSC itself is a partnership of 16 different institutions, including industry,
academia, as well as other national labs. It includes almost 300 staff, students, and interns
working on a very broad and aggressive research program, partly looking at the development of
quantum materials for new devices, the development of quantum simulators for predictive capabilities
and scientific discovery,
and then the development of new types of quantum sensors.
And what is unique about QSC and the fact that it has all these different stakeholders
is that we're able to integrate them and focus on these three mission areas
and basically accomplish new types of approaches to quantum technology
that wouldn't have been available otherwise. Now,
the center and its sister centers that are funded by DOE are all under the umbrella of the National
Quantum Initiative, and so they're very high-profile and aggressive efforts. The Quantum Computing
Institute that is established at Oak Ridge National Laboratory actually has been around now for just
over a decade. Its purpose is to integrate
together internally our capabilities in the areas of quantum computing and computational science.
So, of course, Oak Ridge is home to Frontier, the world's fastest supercomputer at the moment. And
so one of the key questions that we are asking ourselves is what does the future of high
performance computing and supercomputing look like? And how does quantum computing as a technology integrate into that future
computational ecosystem? I certainly don't think quantum computers themselves displace
all of computing technology. Instead, it's more of a partnership and an integration of these new
types of approaches to solving problems with our existing workflows.
So the Quantum Computing Institute, or QCI, actually examines that particular issue,
both looking at the current quantum hardware that's available, but then also the types of workflows and application tools that we're going to need
in order to integrate this technology in the future.
That's brilliant. It really is such a great and necessary thing to do. What do you
think the challenges are in integrating quantum computing into supercomputing? You mentioned
about the application area, so that leads into the software infrastructure. But the other part that I
really wanted to press on, in addition to software, is training. What is the state of that? Is it
necessary to have a quantum mechanics, quantum physics background to be able to formulate the problem into something that quantum computers
can solve? Or is that something that can be abstracted away for more traditional
programmer types to use? Yeah, this is a great question and really an exciting part of what's
happening in the field at the moment. So just like we were talking about earlier, the transition out of the experimental physics laboratory into a computer lab or even data center is a pretty big jump for a technology or an idea.
And what comes with that is an additional set of concerns, as well as a lowering of the priority of some of the physics
concerns that would have dominated in the laboratory. But in the high-performance computing
environment, there are things that you are concerned about in terms of the operational
reliability, the power, weight, size, performance of these systems. How do you physically integrate
them together? Some of these technologies are very exotic and they come with them, safety hazards that aren't traditional for HPC environments,
as well as requirements on their performance and their behavior. And you need a workforce
that can both understand and monitor these types of requirements, but then also operate and
facilitate the usage of these types of facilities. So at Oak Ridge, as part of
the QCI and something we call the Quantum Computing User Program, we are actually using our access to
today's quantum computers to try and train ourselves on what these types of concerns are.
So some of this is learning the language of quantum computing, at least at the basic level.
No, not everybody has to have a master's in physics or electrical engineering to access these systems, but they
do need to understand some of the fundamental principles about what's important and when do
I know if things aren't working correctly. But even more fundamental to that is why do I want
to use a quantum computer in these types of facilities. Quantum computers themselves are not commodity devices. They are high performance platforms for solving select sets of problems. And for Oak Ridge,
scientific discovery and technology innovation are two of those areas. So by having our experts
in those areas look at today's quantum computing systems, look at their existing application
workflows, and understand how would I
fit this in there? Would this even provide me a performance advantage? Those are some of the key
questions, and that fits squarely in the type of work that we are used to doing, and many others
are, in terms of evaluating future architectures and how they could accelerate their current
workloads. So I do think that building a workforce that is knowledgeable about the
technology, at least at its functional and operational levels, is critical in order to
increase the adoption and ultimately the usability of quantum computing in these settings.
Travis, another topic that comes up is the idea of emulating quantum computers,
both to run applications that are already available but don't have hardware to run on, and also
to eliminate some of the pesky problems like I/O and kind of management and stuff that is a lot
easier on traditional computers. Looks like GPUs can provide an avenue to put a dent into that
sort of a solution. There's also the question of quantum-inspired algorithms that sometimes lead to
better algorithms, even for classical computers. Would you speak to how these two different avenues
can work in parallel to accelerate things? Yeah, so the field of quantum computing is doing more
than just advancing hardware and software paradigms; it's actually challenging our own understanding of what
computation is and what the limits of computation are. Certainly, some of the most surprising
outcomes from adopting quantum computing have been the recognition, in fact, that there are
conventional approaches or algorithms to solving these problems that had been overlooked in the
past. And these types of quantum-inspired approaches are entirely within reach of our current technology,
but it's by examining more broadly the field of computation through the lens of quantum
computing that we've actually been able to recognize them and then ultimately adopt them.
So I think just by being a foil for ourselves and our understanding of what computation is, quantum
information as a fundamental object is incredibly valuable.
But even moving beyond that, we have found that there are problems which are recognized
as not being solved with conventional techniques and certainly not solved efficiently.
And this is actually a bit more interesting in the sense that over the decades in
which we have developed conventional computing platforms, including supercomputers and commodity
devices, we oftentimes will encounter problems that we simply refuse to try to solve. Effectively,
we recognize that those problems are too hard. We have no efficient method for solving them, and then we move on to
focus on more feasible solutions with our currently available resources. But quantum now
requires us to go back and re-examine that history. What were the problems and decisions that we made
about those problems that have led us to where we are today? And how does quantum as a technology, by opening up new opportunities, new solvers,
new methods, require us to now redo some of those decisions? And I'll give you kind of a detailed
example just to clarify the point. But within the field of chemistry, there's a very frequently
made approximation when you're trying to calculate the dynamics of a chemical reaction. And it's called the Born-Oppenheimer approximation. And basically what you're saying is that the rate
at which electrons move is much faster than the rate at which the nuclei themselves move,
because there's about a 2,000-fold difference in their mass. And if you make that approximation,
you can actually derive a lot of solvers for chemical reactions that are efficient
and easy to simulate on conventional computing platforms. If you didn't make that approximation,
it actually becomes a much more difficult problem to solve. With quantum computers, though,
ignoring that approximation, continuing to keep both the electrons and the nuclei as a single
composite system actually is an efficient way of solving the problem as well.
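The split Humble describes can be written compactly. The full molecular Hamiltonian couples nuclear and electronic motion; the Born-Oppenheimer approximation exploits the roughly 2,000-fold nucleus-to-electron mass ratio to treat the nuclear positions R as fixed parameters of a purely electronic problem:

```latex
H \;=\; T_{\mathrm{nuc}} + T_{\mathrm{el}} + V(r, R)
\;\;\xrightarrow{\ \text{Born--Oppenheimer:}\ R\ \text{fixed}\ }\;\;
H_{\mathrm{el}}(R)\,\psi_{\mathrm{el}}(r; R) \;=\; E_{\mathrm{el}}(R)\,\psi_{\mathrm{el}}(r; R)
```

Dropping the approximation means evolving electrons and nuclei as one composite system, which is classically hard but maps naturally onto quantum simulation.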
So historically, whereas we had viewed this particular type of approach of solving as
inefficient or intractable, now by having access to quantum technology, it opens up an entirely new
path to discovery. So I think, both in terms of the quantum-inspired algorithms and in terms of
the new types of problems we can solve, quantum itself is a really exciting area.
Shaheen, I'm dying for you to ask Travis,
get into your thoughts about quantum parity. We hear so much about quantum superiority,
quantum advantage, and when that'll happen. But you've got some interesting points about
just bringing quantum up to parity with HPC. Yeah, Travis, we were talking about quantum
advantage and when to use
quantum. And of course, that's kind of been presented. And as you mentioned, there are
problems that are completely intractable and those that can simply be solved faster, even though they can be
done with existing computers. But the idea was that in addition to performance advantage, there's
also an energy advantage for quantum computers. So if I can do half the
performance, but at one hundredth of the energy or one thousandth of the energy usage, maybe that's
sufficient advantage for me to use it. Is that part of your calculus when you look at these things?
Absolutely. Absolutely agree with this point. And in fact, I would say that my own framework
considers three specific metrics for evaluation of the value of
quantum computing. One is time to solution, one is accuracy of the answer, and the third would be the
energy consumption. And so like you've just said, if I'm getting the same time to solution and the
same accuracy, but I'm reducing the energy consumption by a thousandfold or even a hundredfold,
my quantum technology is suddenly worth quite a bit of money when I compare it in real dollars to operating, say, an HPC system. Similarly, even if I'm consuming the same amount of energy,
but if I can reduce the time to solution or improve the accuracy, there is a way to get a
value, an advantage from the quantum approach.
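The three-metric framework described here (time to solution, accuracy, energy consumption) can be sketched as a simple comparison. This is a minimal illustration with hypothetical numbers, not anything from Oak Ridge's actual evaluation methodology; the function name and figures are assumptions for the sake of the example.

```python
# Hypothetical sketch of the three-metric framework: time to solution,
# accuracy, and energy consumption. All numbers below are illustrative.

def advantage(quantum, classical):
    """Return per-metric ratios where a value > 1 favors the quantum system.

    Time and energy: smaller is better, so we divide classical by quantum.
    Accuracy: larger is better, so the ratio is inverted.
    """
    return {
        "time": classical["time_s"] / quantum["time_s"],
        "accuracy": quantum["accuracy"] / classical["accuracy"],
        "energy": classical["energy_j"] / quantum["energy_j"],
    }

# Same time to solution and accuracy, but a thousandfold less energy:
# still a meaningful advantage, as discussed above.
q = {"time_s": 100.0, "accuracy": 0.95, "energy_j": 5e3}
c = {"time_s": 100.0, "accuracy": 0.95, "energy_j": 5e6}
print(advantage(q, c))  # energy ratio is 1000.0
```

The point of the sketch is that a win on any one axis, with the other two held even, can justify the quantum approach.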
I think early on, the conversation was dominated by reducing the time to solution
because many quantum algorithms, they reach their advantage
by reducing the number of operations required.
But that's not the only way to win this game.
In fact, I would say computing as a field is actually facing a bit of an energy crisis.
When you look at the
amount of training and energy consumption required in the areas of artificial intelligence and some
of the modeling and simulation problems that are out there, we know that in order to create scalable
systems for the future, we've got to get energy under control. And quantum also offers advantages
in that area, I think. Exactly. You mentioned AI, and we also read about quantum machine learning and the application of quantum
computing to AI.
One challenge and complication there is that AI is traditionally known as an area
that needs a lot of data,
and quantum computing is not exactly the platform where you want to feed terabytes of data into and out of. So how does that work? In what way is the problem formulated or simplified to allow quantum computers to add value to AI? The challenge of input-output with a quantum computing system is well recognized, in that it is not easy to load
data into a quantum computer. I should say it's not easy to load classical data into a quantum
computer. And especially for machine learning applications, artificial intelligence, where
you're doing training, oftentimes you need lots of these data pairs loaded into the system
in order to perform that type of training operation.
People are exploring this. They're trying to come up with ways of using autoencoders and
other types of methods for compressing that information and getting as much as possible
into those systems. But it does feel like there's a fundamental bottleneck in the conversion of
conventional classical bits, ones and zeros, into the quantum representation.
One alternative to that is to create your information internally, within the quantum system itself.
If I have a quantum computational model for some process, and it's the output of that
process that I am using for my AI training, then all of that can be internal to my quantum
computing system to start with.
And this is one way to try to reduce that IO bottleneck.
It does require me to have a model upon which I want to train things,
but that is actually a conventional challenge within the field of machine learning anyway.
So I think there's a lot of value in continuing to explore that area.
Some of it is rooted in the technology and its hardware bandwidth limits, but other parts of
it may come from novel algorithm development and how to extend those applications to this setting.
So final question, Travis, is how do people get engaged with Oak Ridge, and what sort of
resources do you have available that might be a good topic of collaboration?
We actually have several different points of contact that you can reach out to engage with
Oak Ridge National Laboratory as well as the Quantum Science Center.
Of course, we have a website set up at qscience.org.
You can go there to find out more about the Quantum Science Center and our mission as well as email and social media handles.
In addition, the Quantum Computing Institute has a website, quantum.ornl.gov, where you can find more about our mission as well as
our activities. I would also encourage people to look for the Quantum Computing User Program.
This is hosted at the Oak Ridge Leadership Computing Facility, and it's a great way to
test and evaluate current quantum computers. So please feel free to check that out. And then,
of course, always feel free to send me an email. I'm always looking forward to meeting new people,
learning about ways that we can partner together. Brilliant. Thank you so much.
Wonderful. Thank you as well. We've been with Travis Humble. Thanks so much for being with us.
Thank you. It's been a pleasure. That's it for this episode of the At HPC podcast.
Every episode is featured on InsideHPC.com and posted on OrionX.net. Use the comment section
or tweet us with any questions
or to propose topics of discussion. If you like the show, rate and review it on Apple Podcasts
or wherever you listen. The At HPC Podcast is a production of OrionX in association with Inside
HPC. Thank you for listening.