Error mitigation is the path to valuable quantum computing


Image: Production Perig/Adobe Stock

Although quantum computers have seen tremendous improvements in their scale, quality and speed in recent years, quantum error mitigation, the continuous path from today’s quantum hardware to future fault-tolerant quantum computers, “seems to be missing from the narrative,” according to IBM.

In newly released research, the company notes that the ultimate goal is to build a large fault-tolerant quantum processor “before any of the quantum algorithms with proven super-polynomial speed-up can be implemented.”

Recent advances in techniques that are broadly referred to as quantum error mitigation allow for a smoother path toward this goal.

SEE: Quantum computing: A cheat sheet (TechRepublic)

“Along this path, advances in qubit coherence, gate fidelities and speed immediately translate to measurable advantage in computation, akin to the steady progress historically observed with classical computers,” the company wrote in a blog post on Tuesday. “The ultimate litmus test for practical quantum computing is to provide an advantage over classical approaches for a useful problem. Such an advantage can take many forms, the most prominent one being a substantial improvement of a quantum algorithm’s runtime over the best classical approaches.”

This will require the algorithm to have an efficient representation as quantum circuits, which raises two questions. The first is determining which problems can be mapped to quantum circuits that offer better solutions than classical approaches.

The second is determining how reliable outcomes can be obtained for these circuits on quantum hardware with a faster runtime.

To address the first question, IBM said it is working with the community and industry experts to find problems that are solvable with quantum circuits known to be difficult to simulate classically. It is already doing so through the IBM Quantum Network, made up of Fortune 500 companies, academic institutions, national entities, start-ups and the Qiskit community, to explore the problem space of quantum circuits and drive real application and value.

Addressing the second question in practice is more of a challenge. Current quantum hardware is subject to different sources of noise, the most well-known being qubit decoherence, individual gate errors and measurement errors, according to IBM.

“These errors limit the depth of the quantum circuit that we can implement,” IBM said. “However, even for shallow circuits, noise can lead to faulty estimates. Fortunately, quantum error mitigation provides a collection of tools and methods that allow us to evaluate accurate expectation values from noisy, shallow depth quantum circuits, even before the introduction of fault tolerance.”
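To make the idea concrete, here is a minimal numerical sketch of one widely used error-mitigation method, zero-noise extrapolation: run the circuit at deliberately amplified noise levels, then extrapolate the measured expectation value back to the zero-noise limit. The depolarizing-style noise model, its strength and the ideal value below are toy assumptions for illustration, not IBM's implementation:

```python
import numpy as np

# Toy model: the ideal expectation value of some observable is 1.0, but
# depolarizing-style noise of strength p shrinks it at every circuit layer:
#   <O>_noisy = <O>_ideal * (1 - p)^depth
ideal = 1.0
depth = 10
p = 0.01

def noisy_expectation(scale):
    """Expectation value when the noise is deliberately amplified by `scale`."""
    return ideal * (1.0 - scale * p) ** depth

# Measure at noise scales 1x, 2x, 3x and extrapolate back to scale = 0.
scales = np.array([1.0, 2.0, 3.0])
values = np.array([noisy_expectation(s) for s in scales])

coeffs = np.polyfit(scales, values, deg=2)   # quadratic fit in the noise scale
mitigated = np.polyval(coeffs, 0.0)          # evaluate the fit at zero noise

print(f"noisy (1x): {values[0]:.4f}")   # biased well below the ideal value
print(f"mitigated:  {mitigated:.4f}")   # much closer to 1.0
```

In practice the noise amplification is done on the hardware itself (for example, by stretching gate pulses or inserting compensating gate pairs), and the extrapolation trades a modest increase in sampling cost for a large reduction in bias.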

IBM’s recommended tools and techniques

Error mitigation

Probabilistic error cancellation is a “secret sauce” technique used to effectively invert noisy circuits and recover error-free results, even though the circuits themselves are noisy.
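The idea behind probabilistic error cancellation can be sketched in a few lines of linear algebra: if the noise channel is known, its inverse can be written as a quasi-probability mixture of operations that can actually be run, with some of the weights negative. The single-qubit depolarizing model below is a toy assumption for illustration, not IBM's actual noise model:

```python
import numpy as np

p = 0.05  # depolarizing probability (assumed toy noise model)

# Pauli transfer matrix (PTM) of single-qubit depolarizing noise:
# identity on the trace component, uniform shrink on the X, Y, Z axes.
shrink = 1.0 - 4.0 * p / 3.0
noise = np.diag([1.0, shrink, shrink, shrink])

# PTMs of operations we can actually run: do nothing, or conjugate by
# X, Y or Z (each flips the sign of the other two Pauli axes).
ops = {
    "I": np.diag([1.0, 1.0, 1.0, 1.0]),
    "X": np.diag([1.0, 1.0, -1.0, -1.0]),
    "Y": np.diag([1.0, -1.0, 1.0, -1.0]),
    "Z": np.diag([1.0, -1.0, -1.0, 1.0]),
}

# Quasi-probability decomposition of the *inverse* noise channel:
#   noise^{-1} = sum_k q_k * ops[k], with sum q_k = 1 but some q_k < 0.
g = 1.0 / shrink
q = {
    "I": (1.0 + 3.0 * g) / 4.0,
    "X": (1.0 - g) / 4.0,
    "Y": (1.0 - g) / 4.0,
    "Z": (1.0 - g) / 4.0,
}

# Check: the weighted combination really inverts the noise.
inverse = sum(q[k] * ops[k] for k in ops)
print(np.allclose(inverse @ noise, np.eye(4)))  # True
```

Because some weights are negative, the inverse is not a physical channel: in practice one samples each operation with probability |q_k|/γ, attaches the sign of q_k to the measured outcome and rescales by γ = Σ|q_k|, at a sampling cost that grows with γ.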

Scale

In 2021, IBM unveiled the 127-qubit Eagle processor, which the company says is the first quantum processor capable of running quantum circuits that cannot be reliably simulated classically.

According to the expanded quantum roadmap unveiled in May, the number of qubits within its systems is on track to surpass 4,000 in 2025. The milestones that have been mapped to increase the power, quality and accessibility of quantum hardware and software will serve as the foundation for quantum advantage.

Hardware improvements

Even after larger processors are unveiled, the company is continuing to improve their performance at its research headquarters in Yorktown Heights, NY.

One of these improvements is in the coherence of the qubits. IBM said it has more than doubled the coherence times on its 65-qubit chips since they were unveiled in 2020, and every improvement further reduces the errors in the quantum circuits.

These factors compound, with each improvement magnifying the effects of the others.

“Taken together, all of the above means ever-larger quantum computers with ever lower error rates,” IBM wrote. “And this puts us on a trajectory where we can deliver quantum computers that can out-compete classical computers — carrying out calculations faster, better and more efficiently.”

Processor improvements

The company also noted that these ideas go beyond theory and officials have already started to demonstrate the efficacy of error mitigation on large processors.

The path to quantum advantage will be driven by improvements in the quality and speed of quantum systems as their scale grows to tackle increasingly complex circuits, IBM said. Already, it has introduced a metric to quantify the speed of quantum systems, CLOPS (circuit layer operations per second), and demonstrated a 120x reduction in the runtime of a molecular simulation.

“The coherence times of our transmon qubits exceeded 1 ms, an incredible milestone for superconducting qubit technology,” IBM said. “Since then, these improvements have extended to our largest processors, and our 65-qubit Hummingbird processors have seen a 2-3x improvement in coherence, which further enables higher fidelity gates.”

In its latest Falcon r10 processor, IBM Prague, two-qubit gate errors dipped under 0.1%, which IBM said was “yet another first for superconducting quantum technology, allowing this processor to demonstrate two steps in Quantum Volume of 256 and 512.”
