Quantum computers are like librarians: they both hate noise.
Compared with their classical counterparts, quantum computers are demanding, requiring a quiet environment to perform calculations undisturbed. But even the quietest space in the universe hums with quantum noise: the inevitable jostling of electrons and other atomic-scale effects. If physicists could smooth out the quantum errors caused by noise in a large enough quantum computer, it could perform calculations, such as detailed simulations of molecules, that are intractable for classical computers.
While hardware improvements help, a key component is quantum error correction (QEC), a set of techniques to protect information from this quantum din. “We need our qubits to be nearly perfect, and we can’t do that with engineering alone,” says Google quantum computing researcher Michael Newman.
On Monday Google published its latest error-correction research in the journal Nature, showing for the first time that errors can be suppressed exponentially as a quantum computer grows in size. “As you make the system bigger and bigger, you get better at correcting errors, but you’re also creating more errors,” says Daniel Gottesman, a quantum information theorist at the University of Maryland, who was not involved in the research. “Once you get past this transition, where you can correct errors faster than they occur, making ever bigger systems makes them better.”
Google researchers built a superconducting chip with 105 qubits, the quantum counterparts of classical bits. They then linked multiple physical qubits together to form a conglomerate called a logical qubit. The logical qubit retained its quantum state twice as long as the individual physical qubits that composed it and had a one-in-1,000 chance of error per computation cycle. (For comparison, the error rate of a typical classical computer is about one in a billion billion, essentially zero.)
The results first appeared on the preprint server arXiv.org in August, but today Google shared additional details about the technology that made the breakthrough possible: a new quantum processor called Willow (an upgrade from its arboreally named predecessor, Sycamore). “Really good qubits are the stuff that enables quantum error correction,” says Julian Kelly, director of quantum hardware at Google and a co-author of the new paper.
Google isn’t the only company making strides in error correction. In September a joint team of researchers from Microsoft and Quantinuum, a quantum computing company based in Broomfield, Colo., published results on arXiv.org showing that, using qubits made from laser-trapped ions, they could encode 12 logical qubits with an error rate of about one in 1,000.
Even with advances in error correction, practical applications of quantum computers are unlikely in the near term. Estimates vary, but the consensus among many researchers is that to run useful algorithms or perform robust chemistry simulations, a quantum computer would need hundreds of logical qubits with error rates below one in a million.
All That Noise
Two main types of error plague quantum computers: bit flips and dephasing. A bit flip, which also happens in classical computers, changes a qubit from 0 to 1 or vice versa. Dephasing is uniquely quantum: it pulls qubits out of their delicate quantum state, like taking a cake out of the oven before it’s ready. Either error can ruin a calculation.
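The two error types can be illustrated on a single qubit’s state, written as a pair of amplitudes for 0 and 1. This is a minimal sketch using plain Python (no real quantum hardware or library): a bit flip swaps the two amplitudes, while a phase flip, the elementary form of dephasing, silently negates the relative phase between them.

```python
# A qubit's state is a pair of amplitudes (a, b) for |0> and |1>.
# Illustrative sketch of the two basic quantum error types:

def bit_flip(state):
    """Pauli-X error: swaps |0> and |1>, like a classical bit flip."""
    a, b = state
    return (b, a)

def phase_flip(state):
    """Pauli-Z error: negates the phase of |1>; no classical analogue."""
    a, b = state
    return (a, -b)

# An equal superposition (|0> + |1>) / sqrt(2):
plus = (2 ** -0.5, 2 ** -0.5)

# A phase flip leaves the measurement probabilities |a|^2 and |b|^2
# unchanged but corrupts the superposition, ruining later interference.
print(phase_flip(plus))   # amplitudes become (0.707..., -0.707...)
print(bit_flip((1, 0)))   # the state |0> becomes |1>
```

The danger of dephasing is visible here: unlike a bit flip, it leaves measurement statistics of a single qubit untouched, so it can corrupt a computation without being directly observable.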
Classical error correction often preserves information through redundancy. If Alice wants to send the message “1” to Bob, she can send it in triplicate, copying the 1 twice to transmit “111.” That way, even if one bit flips in transit, turning the message into “101,” Bob can still trust that Alice meant to send “1.” But copying information in this way is forbidden by the laws of quantum mechanics, so in the 1990s researchers had to develop new error-correction schemes for quantum computers. “We need to spread the information out so that there is redundancy but no duplication,” Gottesman says. By spreading information across the physical qubits of a logical qubit, it can be preserved even if an individual physical qubit succumbs to errors.
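Alice and Bob’s scheme, the classical repetition code, can be simulated in a few lines. This sketch is purely illustrative (the flip probability is an arbitrary choice, not a measured rate): each bit is sent in triplicate through a noisy channel, and majority voting recovers the message far more reliably than sending a single copy would.

```python
import random

def encode(bit):
    """Encode one bit redundantly as three copies: 1 -> [1, 1, 1]."""
    return [bit] * 3

def noisy_channel(codeword, flip_prob, rng):
    """Flip each transmitted bit independently with probability flip_prob."""
    return [b ^ 1 if rng.random() < flip_prob else b for b in codeword]

def decode(codeword):
    """Majority vote: a corrupted '101' is still read as 1."""
    return 1 if sum(codeword) >= 2 else 0

rng = random.Random(0)
trials = 100_000
flip_prob = 0.1  # assumed per-bit channel error rate (illustrative)

errors = sum(decode(noisy_channel(encode(1), flip_prob, rng)) != 1
             for _ in range(trials))

# Majority voting fails only if two or more copies flip, which happens
# with probability 3p^2 - 2p^3 (about 2.8% here) versus the raw 10%.
print(f"logical error rate: {errors / trials:.4f}")
```

The same trade-off drives quantum error correction: redundancy converts many small errors into few large ones, except that, as Gottesman notes, the quantum version must spread information without literally copying it.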
Researchers have spent decades devising codes that can detect and correct errors, but until recently there weren’t enough high-quality qubits to run them on. Now the hardware has caught up with the impressive software. In 2022 Google used error correction on its Sycamore processor to lower the overall error rate. But that rate was still shy of a key threshold, so adding more physical qubits to a logical qubit made performance worse, not better. “As logical qubits get bigger, there’s more room for error,” says Newman, who co-authored the new study as well as a preprint on the 2022 results.
The latest advance is largely thanks to Willow, which improves on Sycamore in three ways. First, Willow has more physical qubits: 105, compared with Sycamore’s 72. More physical qubits make bigger logical qubits possible. “It’s not just the number of qubits,” Kelly says. “Everything has to be better at the same time.” By refining their fabrication processes, Kelly and his colleagues also improved the quality of the individual qubits: Willow’s are more robust than Sycamore’s, retaining their delicate quantum state five times longer and suffering lower error rates.
To test error correction, the Google researchers encoded increasingly large logical qubits: first from a 3×3 grid of physical qubits, then a 5×5 grid, and finally a 7×7 grid. As the logical qubits grew, the error rate dropped significantly. “I saw these numbers, and I thought, ‘Oh my god, this is really going to work,’” Newman says.
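The exponential suppression behind those shrinking error rates can be sketched with the textbook scaling relation for such grid codes. All of the numbers below are illustrative placeholders, not Google’s measured values: the point is only that once the physical error rate sits below the threshold, each step up in grid size multiplies the logical error rate by a constant factor smaller than one.

```python
# Illustrative sketch of the below-threshold scaling law (not real data):
# the logical error rate falls exponentially with the code distance d,
# where a 3x3 grid has d = 3, a 5x5 grid d = 5, and a 7x7 grid d = 7:
#     eps_logical ~ A * (p / p_th) ** ((d + 1) / 2)

p = 0.003     # assumed physical error rate per operation (made up)
p_th = 0.01   # assumed error-correction threshold (made up)
A = 0.1       # assumed prefactor (made up)

def logical_error_rate(d):
    """Logical error rate for an odd code distance d, per the law above."""
    return A * (p / p_th) ** ((d + 1) // 2)

for d in (3, 5, 7):
    print(f"d={d}: logical error rate ~ {logical_error_rate(d):.2e}")

# Each step from d to d + 2 multiplies the rate by p / p_th = 0.3 here,
# so bigger logical qubits get better, not worse, once p < p_th.
# Above threshold (p > p_th) the same formula grows with d, which is
# the regime Sycamore was stuck in back in 2022.
```

This is why crossing the threshold matters so much: below it, modest improvements in grid size compound into exponential gains in reliability.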
A Sense of Scale
Experts were deeply impressed by Google’s results. Scientific American reviewed the peer-review reports of four anonymous referees. “I think it’s a fantastic achievement that has really excited the community,” one concluded. Another agreed, writing that “this is one of the most important experimental quantum information results of the year (if not the decade).”
Graeme Smith, a quantum information researcher at the University of Waterloo in Ontario, is impressed by the result in part because it doesn’t cut corners. “Focusing on error correction is the right thing to do,” he says. “It’s a real improvement.” Many previous error-correction results relied on postselection, the practice of discarding error-ridden runs, to achieve an artificially low error rate.
There are still caveats to note, even with Google’s results. Krysta Svore, a quantum computing researcher at Microsoft, observed that by another metric, the error rate was not one in 1,000 but one in 100. Still, she says, “it is important to increase performance as size increases. That’s the key thing that makes this scalable.”
Everyone seems to agree that the recent advances in error correction mark a sea change. “What’s absolutely exciting at the moment is the advancement of quantum error correction,” Svore says. For Gottesman and others who helped develop the theory behind error correction decades ago, the long wait is over. “It’s great to finally see these demonstrations of fault tolerance,” he says.
There’s been a lot of hype around quantum computers. In its most extreme forms, this includes claims that the devices will help cure cancer or solve climate change, or that they have created a wormhole. Concerned researchers often lament that the hype breeds unreasonably high expectations and may even bring on a “quantum winter,” in which funding dries up. The recent error-correction results reveal another potential casualty: truly spectacular advances, like this one, risk being overlooked.