Algorithmic warfare: DARPA tests quantum computing capabilities
A GTRI researcher works on a quantum computer.
Sean McNeil/GTRI Photo
The Defense Advanced Research Projects Agency recently funded the second phase of a quantum computing project that aims to expand the usefulness of emerging technologies, according to one of the project’s lead researchers.
The second phase of the Georgia Tech Research Institute-led project received $9.2 million in funding to allow scientists to conduct additional experiments on a quantum computing system configured to potentially string together more computational units than ever before.
The DARPA project – Optimization with Noisy Intermediate-Scale Quantum devices – aims to “demonstrate the quantitative advantage of quantum information processing by exceeding the performance of purely classical systems to solve optimization problems.”
Researcher Creston Herold said that one of the classic optimization problems that quantum computing systems could solve is the traveling salesman problem.
“A famous problem is the traveling salesman problem, where you have a list of addresses where you need to pick up and drop off packages for delivery, for example,” he said. “And you want to find the most efficient route, whether in time or distance traveled, or the fewest left turns made, or the least fuel used.”
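The difficulty Herold describes can be seen in a few lines of code. The brute-force sketch below (with made-up coordinates, not data from the project) checks every route; because the number of routes grows factorially with the number of stops, this approach stops being feasible very quickly, which is what makes the problem a target for new hardware.

```python
# Brute-force traveling salesman sketch with hypothetical delivery stops.
# The factorial number of possible routes is what makes this hard classically.
from itertools import permutations
from math import dist

stops = {"depot": (0, 0), "A": (2, 3), "B": (5, 1), "C": (1, 6)}

def route_length(order):
    # Total distance of depot -> each stop in 'order' -> back to depot.
    path = ["depot", *order, "depot"]
    return sum(dist(stops[a], stops[b]) for a, b in zip(path, path[1:]))

# Check all possible orderings and keep the shortest round trip.
best = min(permutations(["A", "B", "C"]), key=route_length)
print(best, round(route_length(best), 2))
```

With three stops there are only six routes to check; with 20 stops there are more than 10^18, which is why exhaustive search does not scale.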
This type of problem appears in a wide variety of logistical problems in defense and other government affairs, he noted.
Quantum computers use basic units called qubits rather than the 1s and 0s of traditional computers. Their computing power derives from the ability of each qubit to be both 1 and 0 simultaneously, rather than being limited to one or the other. As a result, a quantum computer could run more complicated algorithms and work much faster than a traditional computer.
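One way to see why this matters: describing a register of n qubits classically takes 2^n complex amplitudes, so the state space doubles with every qubit added. The toy sketch below is only an illustration of that bookkeeping; real quantum hardware does not store this vector anywhere.

```python
# Sketch: an n-qubit register is described by 2**n complex amplitudes,
# which is why the state space grows exponentially with each added qubit.
# (Illustrative only; a quantum device does not store this list.)

def uniform_superposition(n):
    # Equal superposition of all 2**n basis states,
    # like the result of putting each qubit into "both 1 and 0".
    dim = 2 ** n
    amp = (1 / dim) ** 0.5
    return [amp] * dim

state = uniform_superposition(3)
print(len(state))                  # 8 basis states for 3 qubits
print(sum(a * a for a in state))   # probabilities sum to 1 (up to float rounding)
```

At 10 qubits, the size of the chain the team built, the list already has 1,024 entries; at a few hundred qubits it could no longer be stored on any classical machine.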
This research aims to go beyond most advances in quantum computing made so far, Herold explained. Quantum computers exist today, but they are as large as the first traditional computers and have not yet developed the computing power to compete with their conventional counterparts.
While most trapped-ion quantum computing systems use oscillating electric fields to confine ions, one of the team’s researchers, Brian McMahon, developed a “unique” configuration optimized for a more efficient process.
The trapping scheme – called a Penning trap – uses a combination of a magnetic field and an electric field to confine two-dimensional ion crystals that perform the quantum operations.
“The use of rare earths is actually in the permanent magnets, which form the trap,” Herold said. “There are magnets like neodymium or samarium cobalt. They are very, very strong magnets.”
The trap uses these rare-earth metals in place of “bulky, cryo-cooled superconducting magnets,” according to the team.
The team has already completed 18 months of trials and experiments. During this time, the researchers constructed an ion chain 10 qubits in length. A qubit is the basic unit of information in a quantum computing system.
Herold said the short chain is a starting point, but the research ultimately aims to go much further.
“It was really about testing the control scheme and showing that this way of operating the device would solve these problems as expected,” he said.
Adding thousands more qubits to the chain would allow the computer to calculate more accurate solutions, Herold said. Without many more qubits, the quantum computer would have about the same power as a classical machine, he said.
“At the start of the project, we knew we would need hundreds of qubits to really push forward on solving an important problem,” he said. “We can still simulate everything that happens on a quantum device this small; it’s just too small to attack a big enough optimization problem that we don’t already know the answer to easily.”
But that doesn’t mean traditional computing isn’t playing a role in the project. The researchers use classical computing hardware to guide the quantum hardware to a better starting point, so the system doesn’t have to check every possible solution.
“The classical nature of it is that we’re using a classical process to sort of monitor quantum hardware and decide what to do next,” Herold said.
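The feedback loop Herold describes can be sketched in a few lines. In the toy version below, the quantum hardware is mocked by an ordinary function with a hypothetical cost landscape (not the project's actual objective): a classical routine proposes parameters, the "device" evaluates them, and the classical side keeps improvements and decides what to try next.

```python
# Hedged sketch of a classical-monitors-quantum loop. The quantum device is
# mocked by quantum_evaluate(); only the classical decision logic is real here.
import random

random.seed(0)

def quantum_evaluate(theta):
    # Stand-in for a run on quantum hardware: returns a noisy cost to minimize.
    # (Hypothetical cost landscape with a minimum near theta = 1.3.)
    return (theta - 1.3) ** 2 + random.gauss(0, 0.01)

theta, step = 0.0, 0.5
best_cost = quantum_evaluate(theta)
for _ in range(100):
    candidate = theta + random.choice([-1, 1]) * step
    cost = quantum_evaluate(candidate)
    if cost < best_cost:          # classical feedback: keep improvements
        theta, best_cost = candidate, cost
    else:
        step *= 0.95              # narrow the search when a move fails
print(round(theta, 2), round(best_cost, 4))
```

The point of the hybrid approach is exactly this division of labor: the quantum part only has to evaluate promising candidates, rather than the system checking every possible solution.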
As promising as the project has proven so far, researchers still face daunting technical challenges. For example, the more complex the quantum system becomes, the more likely it is to have a large error rate caused by “noise” – a term meaning interference with the state of qubits in the quantum computer.
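The scaling concern has a simple arithmetic core: if each operation succeeds with probability (1 - p), the chance an entire computation runs error-free decays exponentially with the number of operations. The numbers below are illustrative, not measured rates from the GTRI system.

```python
# Why noise limits scale: error-free probability decays exponentially
# with circuit size. (Hypothetical per-gate error rate, for illustration.)
error_rate = 0.001

for gates in (100, 1_000, 10_000):
    fidelity = (1 - error_rate) ** gates
    print(f"{gates} gates -> {fidelity:.3f} chance of no error")
```

Even a one-in-a-thousand error rate leaves almost no chance of a clean run once a computation needs tens of thousands of operations, which is why mitigating noise is central to scaling up.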
The research team includes scientists at Oak Ridge National Laboratory, who use a supercomputer there to map the best path to minimize noise in the quantum system as it scales.
“With quantum hardware, we’re always struggling with noise, and at some point there will be so many errors that we can’t really scale the hardware,” Herold said.
While part of the research is figuring out how to mitigate errors, the amount of noise will eventually limit the number of qubits in the chain and therefore the complexity of the system, he explained.
However, if researchers can find solutions to these challenges for experiments, the results will be meaningful across industries, Herold said.
“This project will show that larger collections of qubits can solve optimization problems, and in a better way than we currently know how, and that would have a truly transformative impact on how these problems are solved,” he said.