An ordinary computer could beat Google’s quantum computer

If the dawn of the quantum computing era began 3 years ago, its rising sun may already have ducked behind a cloud. In 2019, Google researchers claimed they had passed a milestone known as quantum supremacy when their quantum computer, Sycamore, performed an abstract calculation in 200 seconds that they said would tie up a supercomputer for 10,000 years. Now, scientists in China have shown they can perform the same calculation in a few hours with ordinary processors. A supercomputer, they say, could beat Sycamore outright.

“I think they’re right that if they had access to a big enough supercomputer, they could simulate the task … in a matter of seconds,” says Scott Aaronson, a computer scientist at the University of Texas at Austin. Greg Kuperberg, a mathematician at the University of California, Davis, says the advance takes some of the gloss off Google’s claim. “Getting to within 300 feet of the summit is less exciting than reaching the summit.”

Still, the promise of quantum computing remains undimmed, Kuperberg and others say. And Sergio Boixo, lead scientist for Google Quantum AI, said in an email that the Google team knew its edge wouldn’t last much longer. “In our 2019 paper, we said that classical algorithms will be improved,” he said. But “we don’t think this classical approach can keep up with quantum circuits in 2022 and beyond.”

The “problem” Sycamore solved was designed to be difficult for a conventional computer but as easy as possible for a quantum computer, which manipulates qubits that can be set to 0, 1, or any combination of 0 and 1 at the same time. Together, Sycamore’s 53 qubits, tiny resonating electrical circuits made of superconducting metal, can encode any number from 0 to 2^53 (about 9 quadrillion), or even all of them at once.
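Those numbers are what make a brute-force classical simulation so daunting. A quick back-of-the-envelope sketch in Python (the 16-bytes-per-amplitude figure below is an illustrative assumption, not a number from the paper):

```python
# 53 qubits can encode 2**53 distinct bitstrings.
n_qubits = 53
n_states = 2 ** n_qubits
print(f"{n_states:,}")          # 9,007,199,254,740,992 -- about 9 quadrillion

# A naive "state vector" simulation stores one complex amplitude per
# bitstring; assuming 16 bytes each (two 64-bit floats), that comes to:
bytes_per_amplitude = 16
print(n_states * bytes_per_amplitude / 1e15, "petabytes")   # ~144 petabytes
```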

Starting with all the qubits set to 0, the Google researchers applied a random but fixed set of logical operations, or gates, to single qubits and pairs of qubits over 20 cycles, then read out the qubits. Crudely speaking, quantum waves representing all possible outputs sloshed among the qubits, and the gates created interference that reinforced some outputs and canceled others. So some outputs should have appeared with greater probability than others. Over millions of trials, a spiky output pattern emerged.
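To see where the spikes come from, here is a toy state-vector simulation of the same kind of random circuit, shrunk to 4 qubits so it runs instantly (the gate choices below are illustrative stand-ins, not Sycamore’s actual gate set):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4                                   # toy register; Sycamore used 53 qubits
dim = 2 ** n
state = np.zeros(dim, dtype=complex)
state[0] = 1.0                          # all qubits start in |0>

def random_single_qubit_gate():
    """A random 2x2 unitary, via QR decomposition of a random complex matrix."""
    m = rng.normal(size=(2, 2)) + 1j * rng.normal(size=(2, 2))
    q, _ = np.linalg.qr(m)
    return q

def apply_single(state, u, q):
    """Apply a 2x2 gate u to qubit q of the n-qubit state vector."""
    psi = np.moveaxis(state.reshape([2] * n), q, 0)
    psi = np.tensordot(u, psi, axes=([1], [0]))
    return np.moveaxis(psi, 0, q).reshape(dim)

def apply_cz(state, a, b):
    """Controlled-Z on qubits a and b: flip the sign when both read 1."""
    psi = state.reshape([2] * n).copy()
    idx = [slice(None)] * n
    idx[a], idx[b] = 1, 1
    psi[tuple(idx)] *= -1
    return psi.reshape(dim)

for _ in range(20):                     # 20 cycles, echoing the experiment
    for q in range(n):
        state = apply_single(state, random_single_qubit_gate(), q)
    for q in range(0, n - 1, 2):
        state = apply_cz(state, q, q + 1)

probs = np.abs(state) ** 2              # interference concentrates probability
print(np.sort(probs)[::-1])             # spiky: far from a flat 1/16 per output
```

With 53 qubits that state vector no longer fits in memory, which is where the tensor-network trick described below comes in.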

Google researchers argued that simulating those interference effects would overwhelm even Summit, a supercomputer at Oak Ridge National Laboratory that has 9,216 central processing units and 27,648 faster graphics processing units (GPUs). Researchers with IBM, which developed Summit, quickly countered that if they commandeered every bit of the computer’s available hard drive space, it could handle the calculation in a few days. Now, statistical physicist Pan Zhang of the Chinese Academy of Sciences’ Institute of Theoretical Physics and colleagues have shown how to beat Sycamore in a paper in press at Physical Review Letters.

Following others, Zhang and colleagues recast the problem as a 3D mathematical array called a tensor network. It had 20 layers, one for each cycle of gates, with 53 dots on each layer, one for each qubit. Lines connecting the dots represented the gates, with each gate encoded in a tensor, a 2D or 4D grid of complex numbers. Running the simulation then reduced, essentially, to multiplying all the tensors together. “The advantage of the tensor network method is that we can use multiple GPUs to compute in parallel,” Zhang says.
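As a cartoon of what “multiplying all the tensors together” means, here is a two-qubit, one-cycle network contracted with a single numpy.einsum call (the tensors and index labels are made up for illustration; the real network has thousands of tensors and is contracted across many GPUs):

```python
import numpy as np

rng = np.random.default_rng(1)

def random_unitary(d):
    """A random d x d unitary via QR decomposition."""
    m = rng.normal(size=(d, d)) + 1j * rng.normal(size=(d, d))
    q, _ = np.linalg.qr(m)
    return q

# Initial state |00> as two rank-1 tensors, one per qubit (the bottom "dots").
q0 = np.array([1.0, 0.0], dtype=complex)
q1 = np.array([1.0, 0.0], dtype=complex)

# One cycle: a single-qubit gate on each qubit (2D tensors), then a
# two-qubit gate (a 4D tensor with two input legs and two output legs).
u0 = random_unitary(2)                        # indices: (out, in)
u1 = random_unitary(2)
u01 = random_unitary(4).reshape(2, 2, 2, 2)   # indices: (out0, out1, in0, in1)

# Contracting the network: every line in the diagram becomes a repeated
# index letter that einsum sums over.
amplitudes = np.einsum("abcd,ce,df,e,f->ab", u01, u0, u1, q0, q1)

probs = np.abs(amplitudes) ** 2
print(probs)                                  # output distribution over 00, 01, 10, 11
```

On a toy example numpy happily picks the contraction order itself; at Sycamore’s scale, choosing a good contraction order and splitting the work across GPUs is where the real effort goes.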

Zhang and colleagues also relied on a key insight: Sycamore’s calculation was far from exact, so theirs didn’t need to be either. Sycamore calculated the distribution of outputs with an estimated fidelity of 0.2%, just enough to distinguish the fingerprint-like spikiness from noise in the circuitry. So Zhang’s team traded accuracy for speed by cutting some lines in their network and removing the associated gates. Losing just eight lines made the calculation 256 times faster while maintaining a fidelity of 0.37%.
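The arithmetic behind that speedup is easy to sketch (treating each cut line as a two-valued index whose removal halves the work is a simplification; the actual gain depends on which lines are cut):

```python
# Each line (bond) in the tensor network carries a two-valued index.
# Cutting a line and keeping only part of the sum over that index roughly
# halves the contraction work, so cutting k lines buys about a 2**k speedup,
# at the cost of discarding some of the quantum amplitude (lower fidelity).
cut_lines = 8
speedup = 2 ** cut_lines
print(speedup)      # 256, the factor reported by Zhang's team
```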

The researchers calculated the output pattern for 1 million of the 9 quadrillion possible number strings, relying on an innovation of their own to obtain a truly random, representative set. The computation took 15 hours on 512 GPUs and yielded the telltale spiky output. “It’s fair to say that the Google experiment has been replicated on a conventional computer,” says Dominik Hangleiter, a quantum computer scientist at the University of Maryland, College Park. On a supercomputer, the computation would take a few tens of seconds, Zhang says, 10 billion times faster than the Google team estimated.

The advance underscores the dangers of racing quantum computers against conventional computers, the researchers say. “There is an urgent need for better quantum supremacy experiments,” says Aaronson. Zhang suggests a more practical approach: “We have to find some real-world applications to demonstrate the quantum advantage.”

Still, the Google performance wasn’t just hype, the researchers say. Sycamore required far fewer operations and less power than a supercomputer would, Zhang notes. And if Sycamore’s fidelity had been just a little higher, he says, his team’s simulation could not have kept up with it. As Hangleiter puts it, “That’s what the Google experiment was meant to do: start this race.”
