Boffins from China push quantum computing envelope for 'supremacy' in emerging photon field

Light-based quantum system bests classical supercomputer

Boffins from China say they have managed to detect as many as 76 photons using a quantum computer, a result said to be the second demonstration of "quantum supremacy" or "quantum primacy" – solving a problem that a quantum computer can handle far better than any classical computer.

Physicists from the University of Science and Technology of China (USTC) in Hefei, led by Chao-Yang Lu and Jian-Wei Pan, reported their results in a paper published Thursday in the journal Science.

The researchers conducted a Gaussian Boson Sampling test, a variation on the Boson Sampling technique proposed in 2010 by Scott Aaronson, professor of computer science at the University of Texas at Austin, and Alex Arkhipov, at the time a doctoral student.

The experiment involves sending photons through a beam splitting system and measuring their distribution – a task that a quantum computer turns out to be far better at doing than a traditional supercomputer.
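
To see why the classical side struggles, here is a minimal Python sketch – ours, not the USTC team's code – of the original Aaronson–Arkhipov flavour of boson sampling: the probability of any given detection pattern involves the permanent of a submatrix of the interferometer's unitary, and brute-force permanents blow up factorially with photon number. Jiuzhang ran the Gaussian variant, where hafnians of submatrices play the analogous role, but the scaling story is the same. The mode count, photon number, and output pattern below are illustrative choices, not the experiment's parameters.

import itertools
import numpy as np

def permanent(a):
    """Brute-force matrix permanent: O(n! * n), hopeless beyond roughly 15 photons."""
    n = a.shape[0]
    return sum(np.prod([a[i, p[i]] for i in range(n)])
               for p in itertools.permutations(range(n)))

def random_unitary(m, rng):
    """Haar-ish random m x m unitary via QR of a complex Gaussian matrix."""
    z = rng.normal(size=(m, m)) + 1j * rng.normal(size=(m, m))
    q, r = np.linalg.qr(z)
    return q * (np.diag(r) / np.abs(np.diag(r)))

rng = np.random.default_rng(0)
m, n = 8, 3                            # toy numbers: 8 modes, 3 photons
U = random_unitary(m, rng)             # the beam-splitter network as a unitary
out_modes = (0, 2, 5)                  # one candidate detection pattern
sub = U[np.ix_(out_modes, range(n))]   # rows = output modes, cols = input modes
prob = abs(permanent(sub)) ** 2
print(f"P(photons land in modes {out_modes}) = {prob:.3e}")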

The Chinese team developed an apparatus, dubbed Jiuzhang, that consists of a laser, mirrors, prisms, and photon detectors. The group reported that they achieved a sampling rate that's "~10^14 faster than using the state-of-the-art simulation strategy and supercomputers."

So there it is, further evidence that quantum computers really can do certain calculations better than classical computers. The result reinforces Google's quantum supremacy experiment from last year and also surpasses it: as Aaronson noted Thursday in a blog post, this is the first time the advantage of quantum computing has been demonstrated using photonics (light) rather than superconducting qubits.

But the USTC experiment represents only one step in a long journey toward practical quantum computing. The research, Aaronson observes, is not synonymous with useful, universal, scalable, or fault-tolerant quantum computing – all milestones that have yet to be achieved.

As a gauge of the gap between quantum computing today and where it needs to go to become commercially viable, Anne Matsuura, director of quantum and molecular technologies at Intel, said in a presentation on Thursday that, because effective error correction in quantum systems requires tens of physical qubits to produce one logical qubit, Intel believes "that a commercial scale system will really require millions of qubits."
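
For a rough sense of that gap – our back-of-envelope arithmetic, not Intel's – here is what the overhead implies if you take the figures above at face value. The 50-to-1 physical-to-logical ratio is an assumed illustrative value within the "tens of qubits" Matsuura described; real error-correction overheads depend on hardware error rates and the code used.

# Illustrative arithmetic only: the overhead below is an assumption, not an Intel figure.
physical_qubits = 1_000_000   # "millions of qubits" for a commercial-scale system
overhead = 50                 # assumed physical qubits per logical qubit ("tens")
logical_qubits = physical_qubits // overhead
print(f"{physical_qubits:,} physical qubits -> ~{logical_qubits:,} logical qubits")
# Today's machines (Google's 53 qubits, IBM's 65) are still counting physical qubits.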

Google's system topped out at 53 qubits. IBM managed a 65-qubit machine in September and promised a 1,000-qubit device by 2023.

Aaronson also touched on another challenge facing those developing quantum computing systems: Validating results using costly supercomputing power.

He was among the reviewers of the USTC paper and said he asked the researchers why they only bothered to validate the results of their experiment for up to 26-30 photons. Surely, he argued, they could verify up to 40 or 50 photons using existing computers.

"A couple weeks later, the authors responded, saying that they’d now verified their results up to n=40, but it burned $400,000 worth of supercomputer time so they decided to stop there," he wrote.

And don't even think about trying to fully replicate the Jiuzhang results on a classical computer. The USTC paper estimates that the time cost for the TaihuLight supercomputer to generate the same number of samples that the Jiuzhang device produced in 200 seconds would be 2.5bn years.
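
As a quick sanity check – our arithmetic, not a figure from the paper – the two headline numbers are consistent: 2.5 billion years versus 200 seconds works out to a speedup on the order of the ~10^14 quoted above.

SECONDS_PER_YEAR = 3.156e7
classical_seconds = 2.5e9 * SECONDS_PER_YEAR   # 2.5bn years on TaihuLight
jiuzhang_seconds = 200
print(f"speedup ~ {classical_seconds / jiuzhang_seconds:.1e}")   # ~3.9e+14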

For cloud service providers that bill for compute time, you can imagine the appeal of this sort of research. ®
