You're not wrong. The scope for quantum computers remains small

Perhaps a QC can help us work out why we'd want a QC

Systems Approach Back when I was field CTO for VMware in Asia-Pacific and Japan, many of my colleagues expected me to know something about everything in tech, and it was sometimes hard to bring myself to say "I don’t know."

And so in 2018, when someone asked me to explain quantum computing, I gave it a shot and made a complete mess of the explanation. In fact, I made the most typical mistake of a dabbling generalist (often made in the popular science press), which was to say something about trying lots of solutions in parallel, leveraging the quantum property of being in a superposition of multiple states. If you know only one thing about quantum mechanics, it’s likely to be the thought experiment of Schrödinger’s cat, in which the cat is supposedly in two states (alive and dead) at the same time. Well, that turns out to be just enough to produce a pretty bad explanation of quantum computing.

Much of what I do understand about quantum computing comes from Scott Aaronson via his blog and lecture notes. Right at the top of his blog is the line burned into my memory since I first read it: “Quantum computers won't solve hard problems instantly by just trying all solutions in parallel.” This comic does a good job of debunking this line of thought as well.

Quantum supremacy may or may not be a big deal

Anyway, after botching my first attempt to explain it, I decided to go a bit deeper on quantum computing, which ended up being quite timely. The first claim of “quantum supremacy” came out the next year and I immediately wanted to understand whether this was the big deal it seemed to be. (Like so much in this space, it may or may not be a big deal.) One thing led to another and I’ve become just knowledgeable enough (or foolish enough) to attempt a couple of lectures on quantum computing and its practical impact. While it’s truly hard to get an intuitive feel for quantum computing, this talk represents my best effort at giving that intuition.

Last week I came across a post by Aaronson about his experience at the Solvay Conference on Physics, the topic this year being “The Physics of Quantum Information.” Perhaps the most famous Solvay conference was the fifth, held in 1927, at which quantum theory was hashed out by an astonishing list of attendees including Einstein, Marie Curie, Bohr, Schrödinger and a dozen other Nobel prize winners.

The Solvay conference could be the greatest poster child for the importance of interdisciplinary research: in this year’s case, computer scientists exchanging ideas with particle physicists. There is still a huge amount of work to be done both in building practical quantum computers and in figuring out what they are actually good for, and broad expertise is needed to make progress.

But I was inspired to write this post by Aaronson’s suggestion that there is a “Law of Conservation of Weirdness.” It’s more of a hypothesis than a law, but it is at the heart of what many researchers are trying to understand about quantum computing: when does quantum computing provide significant (i.e., super-polynomial) speedup over classical algorithms? Scott’s conjecture is that there has to be something “weird” about the problem for a quantum algorithm to be effective. What remains unsolved to date is the precise nature of that weirdness. But typically there is some sort of structure to the problem that makes it suitable for quantum speedup.

For us non-physicists, the most famous (and potentially most practically important) problem that displays an appropriate level of “weirdness” to benefit from quantum algorithms is factorization. Finding the prime factors of an integer is not just a matter of randomly trying different answers until you stumble on one that works; there is plenty of structure to the problem that allows an efficient quantum algorithm to be found. This is what Shor’s algorithm exploits, and one key step of the algorithm happens to be dramatically more efficient on a quantum computer than on a classical one.

(It’s called the quantum Fourier transform, and if you’d like a bit of intuition about how it works, Aaronson has you covered once again.)
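
If you’d like to see that structure spelled out, here is a toy, purely classical sketch of the reduction at the heart of Shor’s algorithm: factoring reduces to finding the period of a^x mod n. The function names are mine, and the brute-force find_period loop is exponentially slow; it stands in for exactly the step that the quantum Fourier transform makes efficient.

```python
import math
import random

def find_period(a, n):
    # Brute-force the multiplicative order r of a mod n, i.e. the
    # smallest r with a^r = 1 (mod n). This is the step a quantum
    # computer accelerates via the quantum Fourier transform.
    r, value = 1, a % n
    while value != 1:
        value = (value * a) % n
        r += 1
    return r

def factor_via_period(n):
    # Classical outline of Shor's reduction from factoring to period
    # finding. Assumes n is an odd composite with at least two
    # distinct prime factors (not a prime power).
    while True:
        a = random.randrange(2, n)
        g = math.gcd(a, n)
        if g > 1:
            return g               # lucky guess: a shares a factor with n
        r = find_period(a, n)
        if r % 2 == 1:
            continue               # need an even period; try another a
        x = pow(a, r // 2, n)
        if x == n - 1:
            continue               # trivial square root; try another a
        return math.gcd(x + 1, n)  # a nontrivial factor of n

print(factor_via_period(15))       # prints 3 or 5
```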

Why does this problem, and Shor’s algorithm, matter? Well, much of cryptography depends on the supposed difficulty of finding the prime factors of a large number, and on a variety of similarly hard computational tasks. RSA and related public key algorithms are believed to be at risk of becoming ineffective within a decade or two.

This is because the supposedly hard task of determining a private key given a public key depends on the hardness of factoring a large number (or some similarly expensive computation). If quantum computers continue to improve in terms of their number of quantum bits (qubits) and reliability as they have done in recent years, it seems only a matter of time before these algorithms will no longer be secure.
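
To make that dependency concrete, here is a toy sketch (with laughably small numbers, purely for illustration) of why factoring the modulus gives the game away: once the primes p and q are known, the private exponent is one modular inverse away. It assumes Python 3.9+ for math.lcm and the three-argument pow with a negative exponent.

```python
import math

# Toy RSA key (real moduli are 2048+ bits). The public key is (n, e);
# the primes p and q are the secret that factoring n would reveal.
p, q = 61, 53
n, e = p * q, 17              # n = 3233

# With p and q in hand, deriving the private exponent is immediate:
# d is the inverse of e modulo lambda(n) = lcm(p-1, q-1).
lam = math.lcm(p - 1, q - 1)  # Python 3.9+
d = pow(e, -1, lam)           # modular inverse, Python 3.8+

# The derived key decrypts correctly, so recovering the private key
# is no harder than factoring n.
msg = 42
assert pow(pow(msg, e, n), d, n) == msg
print(d)                      # 413
```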

I would argue that there is no reason to panic since we have a while to wait before quantum computers are big enough and reliable enough to crack these algorithms, but the consensus is that we will need new algorithms eventually, and that by the time we know that the old algorithms have been broken, it will be too late. Hence, the wise choice is to plan ahead for “cryptographic agility”, i.e., the ability to swap out algorithms in favor of new ones that are not susceptible to quantum solutions.
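
As an illustration of what that agility can look like in practice, here is a minimal sketch in which applications select a signature scheme by name from a registry rather than hard-coding one. The registry, the scheme names, and the stand-in implementations are all hypothetical, not any real library’s API.

```python
# Sketch of "cryptographic agility": callers look up their signature
# scheme by name, so replacing RSA with a post-quantum scheme becomes
# a configuration change rather than a rewrite.
from typing import Callable, Dict, NamedTuple

class SignatureScheme(NamedTuple):
    sign: Callable[[bytes], bytes]
    verify: Callable[[bytes, bytes], bool]

REGISTRY: Dict[str, SignatureScheme] = {}

def register(name: str, scheme: SignatureScheme) -> None:
    REGISTRY[name] = scheme

# Two stand-in "schemes" (real code would wrap actual crypto libraries).
register("rsa-2048", SignatureScheme(
    sign=lambda msg: b"rsa:" + msg,
    verify=lambda msg, sig: sig == b"rsa:" + msg))
register("ml-dsa", SignatureScheme(   # a NIST PQC selection, as a placeholder
    sign=lambda msg: b"pqc:" + msg,
    verify=lambda msg, sig: sig == b"pqc:" + msg))

# Application code reads the algorithm name from configuration;
# migrating to PQC means changing this one string.
scheme = REGISTRY["ml-dsa"]
sig = scheme.sign(b"hello")
assert scheme.verify(b"hello", sig)
```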

NIST has been running a process for several years to identify suitable candidates for post-quantum cryptography (PQC).

What I find especially interesting about this situation is that the experts in this field are still figuring out which classes of problem are amenable to quantum solutions. This is the point of the “Law of Conservation of Weirdness.” Only a narrow subset of problems that are hard to solve classically are easy to solve with quantum algorithms. What we need for PQC is algorithms that don’t have efficient quantum (or classical) solutions. And while we can say “we haven’t found an efficient quantum solution” to a problem, it is harder to say that no such solution exists. If nothing else, this underscores the need to be agile in our choices of cryptographic algorithms going forward.

Finally, there seems to be a bit of “irrational exuberance” about the ability of quantum computers to solve all sorts of problems, in areas such as machine learning and financial markets. While it’s true that there is weirdness in both of those areas, I don’t think that’s what Aaronson means. My takeaway from his Solvay presentation [PPT] is that the set of problems with efficient quantum solutions remains small, even as the latest research seeks to determine just what makes a problem a good fit for quantum computing. ®

Larry Peterson and Bruce Davie are the authors of Computer Networks: A Systems Approach and the related Systems Approach series of books. All their content is open source and available on GitHub. You can find them on Twitter, their writings on Substack, and past The Register columns here.
