“I think they’re right that if they had access to a big enough supercomputer, they could have simulated the … task in seconds,” says Scott Aaronson, a computer scientist at the University of Texas at Austin. The advance takes some of the shine off Google’s claim, says Greg Kuperberg, a mathematician at the University of California, Davis. “Getting 300 feet from the top is less exciting than getting to the top.”

But the promise of quantum computing remains undiminished, say Kuperberg and others. And Sergio Boixo, principal scientist for Google Quantum AI, said in an email that the Google team knew its advantage might not last long. “In our 2019 paper, we said that classical algorithms will be improved,” he said. But, “we don’t think this classical approach can keep up with quantum circuits in 2022 and beyond.”

The “problem” Sycamore solved was designed to be difficult for a conventional computer but as easy as possible for a quantum computer, which handles qubits that can be set to 0, 1, or—thanks to quantum mechanics—any combination of 0 and 1 at the same time. Together, Sycamore’s 53 qubits, tiny electrical resonance circuits made of superconducting metal, can encode any number from 0 to 2⁵³ (about 9 quadrillion)—or even all of them at once.

Starting with all qubits set to 0, the Google researchers applied a random but fixed set of logic operations, or gates, to individual qubits and pairs over 20 cycles, then read out the qubits. Roughly speaking, quantum waves representing all possible outputs rippled among the qubits, and the gates created interference that amplified some outputs and canceled others, so some outputs should have appeared more likely than others. Over millions of trials, a sharp output pattern emerged.

The Google researchers claimed that simulating these interference effects would overwhelm even Summit, a supercomputer at Oak Ridge National Laboratory, which has 9,216 central processing units and 27,648 faster graphics processing units (GPUs).
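To make the interference idea concrete, here is a toy sketch in plain Python (my own illustration, not Google's actual circuit): a 3-qubit statevector simulator that applies random gate cycles and shows how interference produces a non-uniform, "spiky" output distribution. Sycamore's 53 qubits would require tracking 2⁵³ amplitudes the same way, which is exactly why classical simulation is hard.

```python
import cmath
import math
import random

# Toy model, not Google's circuit: 3 qubits instead of 53, and
# illustrative gate choices (Hadamard plus random phases, then CZ).
N = 3
DIM = 2 ** N          # 2**3 = 8 amplitudes; Sycamore would need 2**53
random.seed(0)

def apply_1q(state, q, g):
    """Apply a 2x2 gate g to qubit q of the statevector."""
    out = state[:]
    for i in range(DIM):
        if not (i >> q) & 1:          # i has qubit q = 0; pair it with j
            j = i | (1 << q)
            a, b = state[i], state[j]
            out[i] = g[0][0] * a + g[0][1] * b
            out[j] = g[1][0] * a + g[1][1] * b
    return out

def apply_cz(state, q0, q1):
    """Controlled-Z gate: flip the sign when both qubits are 1."""
    return [-a if (i >> q0) & 1 and (i >> q1) & 1 else a
            for i, a in enumerate(state)]

s = 1 / math.sqrt(2)
H = [[s, s], [s, -s]]                 # Hadamard gate

state = [0j] * DIM
state[0] = 1 + 0j                     # all qubits start in |0>

for cycle in range(20):               # 20 cycles, as in the experiment
    for q in range(N):
        state = apply_1q(state, q, H)
        phase = cmath.exp(1j * random.uniform(0, 2 * math.pi))
        state = apply_1q(state, q, [[1, 0], [0, phase]])  # random phase
    state = apply_cz(state, 0, 1)
    state = apply_cz(state, 1, 2)

# Interference amplifies some bit strings and suppresses others, so the
# probabilities are far from uniform (uniform would be 1/8 for each).
probs = [abs(a) ** 2 for a in state]
print(sorted(probs, reverse=True))
```

Sampling from this distribution many times is the analogue of Sycamore's "millions of trials": the repeated measurements trace out the characteristic sharp pattern.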
Researchers at IBM, which developed Summit, quickly countered that if they took advantage of every bit of hard drive space available on the computer, the machine could handle the calculation in days. Now, Pan Zhang, a statistical physicist at the Institute of Theoretical Physics of the Chinese Academy of Sciences, and his colleagues have shown how to beat Sycamore in a paper published in Physical Review Letters.

Following others, Zhang and his colleagues recast the problem as a three-dimensional mathematical arrangement called a tensor network. It consisted of 20 layers, one for each cycle of gates, with each layer containing 53 dots, one for each qubit. Lines connected the dots to represent the gates, with each gate encoded in a tensor—a 2D or 4D grid of complex numbers. Running the simulation then reduced, essentially, to multiplying all the tensors together. “The advantage of the tensor network method is that we can use multiple GPUs to do the calculations in parallel,” Zhang says.

Zhang and his colleagues also relied on a key insight: Sycamore’s calculation was far from exact, so theirs didn’t have to be either. Sycamore calculated the output distribution with an estimated fidelity of 0.2%, just enough to distinguish the fingerprint-like sharpness from the noise in the circuit. So Zhang’s team traded accuracy for speed by cutting some lines in their network and eliminating the corresponding gates. Losing just eight lines made the calculation 256 times faster while maintaining a fidelity of 0.37%.

The researchers calculated the output pattern for 1 million of the 9 quadrillion possible number strings, relying on an innovation of their own to obtain a truly random, representative set. The calculation took 15 hours on 512 GPUs and yielded the telltale sharp output pattern. “It’s fair to say that Google’s experiment has been simulated on a conventional computer,” says Dominik Hangleiter, a quantum computer scientist at the University of Maryland, College Park.
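A minimal sketch of the tensor-network idea (my own illustration, not the team's code): for a single qubit, the network collapses to a chain of 2x2 gate matrices, and "simulating the circuit" is literally just multiplying them together. The real network is far larger, with 20 layers of 53 dots, and is contracted in parallel on GPUs.

```python
import math

def matmul2(a, b):
    """Multiply two 2x2 matrices (complex-valued in general)."""
    return [[sum(a[i][k] * b[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

s = 1 / math.sqrt(2)
H = [[s, s], [s, -s]]   # Hadamard gate as a 2D tensor
Z = [[1, 0], [0, -1]]   # Pauli-Z gate as a 2D tensor

# Contract the chain H -> Z -> H into a single tensor, then apply it to
# the starting state |0>. Mathematically H*Z*H equals a NOT gate, so
# the qubit ends up in |1> with probability 1.
net = matmul2(H, matmul2(Z, H))
state0 = [1, 0]
out = [sum(net[i][j] * state0[j] for j in range(2)) for i in range(2)]
probs = [abs(a) ** 2 for a in out]
```

The speedup from trimming the network also follows from this picture: each line carries a two-valued index that must be summed over, so removing one halves the contraction work, and cutting eight lines divides the cost by 2⁸ = 256, matching the 256-fold speedup the team reported.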
On a supercomputer, the calculation would take a few dozen seconds, Zhang says—10 billion times faster than the Google team estimated. The advance underscores the pitfalls of racing a quantum computer against a conventional one, the researchers say. “There is an urgent need for better quantum supremacy experiments,” Aaronson says. Zhang suggests a more practical approach: “We should find some real-world applications to demonstrate the quantum advantage.”

Still, Google’s demonstration was not mere hype, the researchers say. Sycamore required far fewer operations and far less power than a supercomputer, Zhang notes. And if Sycamore had had slightly higher fidelity, he says, his team’s simulation wouldn’t have been able to keep up. As Hangleiter puts it, “The Google experiment did what it was supposed to do: start that race.”