Global advantage through high-performance quantum computing
Quantum computers have the potential to provide a game-changing boost for science. For maximum benefit to society, quantum computers will need to merge with traditional high-performance computing. Here, international collaboration is of key importance for bringing the required expertise and know-how together.
From idea to advantage
It took forty years for quantum computers to mature from idea to the first commercial products. The decades of basic research that have led to this point are marked by international collaboration across nations and continents. It is crucial to recognise that despite soaring interest and investment from the commercial sector, quantum computing is still in its infancy. To become a disruptive technology, years of sustained basic research still lie ahead.
The projected superpower of quantum computers for certain computational tasks has to be taken seriously. One aspect is the threat that sufficiently mature quantum computers pose to digital encryption. However, before actually being able to break the internet, the infamous Shor’s algorithm requires significantly more power and oomph than near-term quantum computers can provide. A much more intriguing outlook is the capacity of quantum computing to boost high-performance computing (HPC), scientific modelling, and R&D in general.
Traditional supercomputers are immensely powerful. The advance in supercomputing has been so steady that one easily gets blinded by the progress. The year 2022 witnessed the leap to exascale computing, with the first system, Frontier, breaking the barrier of performing 10^18 floating-point operations per second, a near-unfathomable number. This is also one million times the capacity of the fastest supercomputer 25 years ago; problems a million times more complex can now be solved. In the understated manner of Mr. Spock: Fascinating. While 25 years might initially sound like prehistory, it actually is not, at least not for those of us who were old enough to go to the movies back then. Titles like The Fifth Element, Men in Black, and, of course, Titanic are not that ancient. Still, despite all future progress in classical processing power, there is a range of computational problems that will forever stay intractable for supercomputers, no matter how many 25-year periods pass. Unless they merge with quantum computers, that is.
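The million-fold figure follows directly from the raw numbers: exascale means 10^18 floating-point operations per second, while the fastest machine of the late 1990s delivered on the order of one teraflop, 10^12 operations per second (the specific teraflop baseline is an assumption here, not stated in the text). A quick sanity check:

```python
# Frontier (2022): first exascale system, ~10**18 operations per second.
exaflops = 10 ** 18

# Circa 1997, the fastest supercomputer delivered on the order of
# one teraflop: 10**12 operations per second (assumed baseline).
teraflops = 10 ** 12

speedup = exaflops // teraflops
print(f"{speedup:,}")  # -> 1,000,000: a million-fold increase in 25 years
```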
Quantum-accelerated HPC can unlock simulations of revolutionary complexity. The delicate details of the electrons performing their ballet in molecules and materials are a perfect example of a highly complex modelling problem at the sub-microscopic scale. A longer length-scale case would be choreographing the marching band of electrons along power grids spanning entire continents, aiming for maximum fault tolerance.
Quantum computers provide the key due to the incredible power contained in their quantum bits, or qubits, the basic unit of quantum information. Each additional perfect qubit doubles the power of a quantum computer. To increase the problem size by that factor of a million, in the quantum world, we just need a modest-sounding twenty additional ideal qubits. The hope, then, is that each addition of twenty perfect qubits will not require 25 years of development.
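The arithmetic behind the "twenty qubits" claim is worth spelling out: if each ideal qubit doubles the reachable state space, then n extra qubits scale it by 2^n, and 2^20 is just over a million. A minimal sketch (the helper function is illustrative, not from the text):

```python
# Each ideal qubit doubles the state space a quantum computer can address,
# so n additional qubits scale the tractable problem size by 2**n.
def extra_qubits_needed(scale_factor: int) -> int:
    """Smallest number of additional ideal qubits whose repeated
    doubling matches or exceeds the desired scale factor."""
    n = 0
    while 2 ** n < scale_factor:
        n += 1
    return n

# The million-fold increase that classical HPC needed 25 years for:
print(extra_qubits_needed(1_000_000))  # -> 20, since 2**20 = 1,048,576
```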
Why HPC then?
Quantum computers are not a substitute for binary computers. They excel at some tasks, but perform poorly at many others. It is by identifying time-consuming tasks that are suited for quantum computing, and injecting them into HPC workflows, that grand-challenge problems can be tackled. As a prime example, digital twins, replicas of Earth, are now being conceived on the most powerful supercomputers. Present plans include creating digital counterparts of, e.g., the weather system, oceans, and biodiversity, in order to guide human behaviour. Ultimately, the goal is to combine the different models into one large interconnected system, dramatically increasing complexity. For this, merging quantum computing with HPC can be decisive.
Green quantum advantage
Exponential quantum speed-up is still, admittedly, at least a decade off. In the near term, however, another type of quantum advantage is even more relevant. For a given problem, quantum computers use a mere fraction of the energy that classical devices do. Quantum computers do not have to provide answers faster than classical computers: getting the answers using less energy will have an immediate, positive global impact. The most power-hungry applications running on supercomputers are those related to electronic structure theory (think more efficient batteries and pharmaceuticals) and artificial intelligence. Coincidentally, these are also problems that are highly suitable for quantum computers!
Advantage through open science
Quantum computing can contribute to society by providing the tools to perform computational experiments on an unprecedented scale and by providing predictions of highly increased accuracy to guide fact-based decision making. Quantum advantage, measured either by computational power or energy efficiency, will change the way we utilise supercomputing. The first advantages will be incremental, providing a slight improvement over purely classical information processing. We are steadily edging towards this point. On the road to transforming quantum computing into a mind-bogglingly powerful scientific instrument, however, slight advantage is only the beginning. It is just the first victory in a junior-league football match: nice, but the big money lies much further ahead. The impatient pre-schooler inside us needs to be told: no, we are not there yet!
Now, the risk is that the first signs of quantum advantage will trigger an avalanche of clamming up on cooperation in the field. The first indications of technology protectionism are already in the air. Coming at too early a stage, this would inevitably delay the onset of the true, revolutionary quantum advantage that is vital to society. In the worst case, advantage is not only delayed but lost entirely, as development stagnates behind overly strict barriers to collaboration. Only by pooling resources across the globe can we find solutions to the most pressing problems and upcoming obstacles. This requires that we as a global community share our know-how as freely and openly as possible. Each year that we can bring the quantum revolution in computing closer adds a year’s worth of science and development utilising quantum-accelerated supercomputing as an aid to finding solutions to some of the grand challenges that humanity and our planet are facing.
HPC centres provide a round table for discussion
Collaboration is a basic activity of most HPC centres. The scale ranges from working with individual customers to get the most out of the services we provide, to international partnerships for setting up powerful cross-border e-infrastructures. For example, the LUMI supercomputer is the most powerful in Europe thanks to a truly pan-European joint effort, and the whole really is significantly greater than the sum of its parts. HPC centres across the world are therefore natural hubs for cooperation also when it comes to making quantum computing impactful. The know-how for setting up a complete, functioning, and relevant computing infrastructure of the future is dispersed across the globe, across disciplines, and across communities. We all need to sit down and discuss how to properly implement the age of quantum computing, in order to get it right. Continued dedication to open science and open-source development is a crucial component on the road towards global quantum advantage. HPC centres can foster this development. Join us for the discussion!
This is the extended edition of the piece originally published as part of the techUK Future of Compute Week, 1 Dec 2022.
Main image: Image dreamt up by the DALL-E 2 AI engine
Author: Mikael Johansson, Quantum Strategist, CSC