Contrarian Views on Quantum Computing
by Bernard Murphy on 01-23-2019 at 12:00 pm

I’m not opposed to the exotic ideas that capture public attention from time to time, but I do enjoy puncturing the popular expectation that every novel technology must be a revolutionary breakthrough. I’ve already made my small contribution to driving a stake through the heart of claims that quantum computing (QC) will replace classical computing (it won’t) or that it will end security as we know it (unlikely). That still leaves a role for QC in some domains, but a couple of recent articles in Quanta magazine suggest even that muted expectation may be shaky.

The first blow came from a young (17) CS/Math major at UT who cut the legs out from under a QC algorithm for the recommendation problem. A paper published in 2016 had demonstrated a method exponentially faster than any known classical algorithm, raising hope that here, finally, was concrete proof that QC could do something practical far better than classical computing. But the UT teenager proved that some of the methods used in the QC approach could also be used in a classical approach, delivering almost comparable performance. Ouch. OK, one example doesn’t prove that QC isn’t useful; there can still be plenty of other applications where QC would tower above any possible classical algorithm. In fact another paper proves that, for another type of problem, QC is demonstrably superior to classical computing. But it’s still kind of discouraging when one of the supposed star applications for QC turns out not to be a star at all.

The second possible blow suggests that effectively managing noise, at scale, through quantum error-correction (QEC) may be too expensive in practice or may even be theoretically impossible (though the author of this work acknowledges that his is a minority view). All systems are noisy and depend on the noise being managed down to an acceptable level. QC systems are particularly sensitive because they depend for their essential behavior on quantum coherence, which at the very simplest level means that a system can be in both of two possible quantum states simultaneously (think Schrödinger’s cat). Noise can break this coherence, collapsing the state into one or the other possibility, but no longer both. When this happens, QC becomes ineffective.
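
To make the decoherence point concrete, here is a minimal NumPy sketch (my own illustration, not taken from either article) using the dephasing channel, a standard textbook noise model. The qubit starts in an equal superposition; the off-diagonal entries of its density matrix are the coherence QC depends on, and the noise steadily erases them while leaving the classical probabilities untouched.

```python
import numpy as np

# A qubit in an equal superposition (|0> + |1>)/sqrt(2), written as a
# density matrix. The off-diagonal entries represent the coherence
# between the two basis states.
plus = np.array([1, 1]) / np.sqrt(2)
rho = np.outer(plus, plus.conj())

def dephase(rho, p):
    """Dephasing channel: with probability p the environment 'looks at'
    the qubit, shrinking the off-diagonal (coherence) terms toward zero."""
    z = np.diag([1, -1])
    return (1 - p) * rho + p * (z @ rho @ z)

print("coherence before noise:", abs(rho[0, 1]))   # 0.5
for _ in range(5):
    rho = dephase(rho, 0.2)
print("coherence after noise: ", abs(rho[0, 1]))   # ~0.04 and shrinking
# The populations (diagonal entries) are untouched, but once the
# off-diagonal terms vanish the state behaves like a plain classical
# coin flip -- exactly the failure mode described above.
```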

You can reduce thermal noise by dropping the QC to very low temperatures, which is why you always see these things submerged in liquid helium cryostats. Unfortunately, temperature isn’t the only source of noise. Quantum behavior is inherently noisy since it is probabilistic rather than deterministic; cooling doesn’t make this go away. There are yet more sources of noise, but you get the point. So QC builders accept and adapt to the problem by adding QEC. Instead of using one physical qubit (quantum bit) per logical bit of information, they use multiple physical qubits per logical qubit, along with error-correction techniques between these multiple qubits. This might look like regular redundancy in safety-critical classical computing, but it’s a lot trickier in QEC, since even looking at a qubit, much less correcting it, also breaks coherence.
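
The classical intuition behind this redundancy is easiest to see in a toy repetition code. The sketch below (mine, purely illustrative) stores one logical bit in three physical bits and locates a single error via parity checks that never read the logical value directly. Real QEC codes (the surface code, for example) do something analogous with ancilla qubits and stabilizer measurements, precisely so the data qubits are never observed.

```python
import random

def encode(bit):
    # One logical bit -> three physical bits.
    return [bit, bit, bit]

def noisy(bits, p=0.1):
    # Flip each physical bit independently with probability p.
    return [b ^ (random.random() < p) for b in bits]

def correct(bits):
    # Parity checks compare neighbours; they reveal *where* an error
    # sits, never *what* the logical bit is.
    s1 = bits[0] ^ bits[1]
    s2 = bits[1] ^ bits[2]
    if s1 and not s2:
        bits[0] ^= 1   # error on bit 0
    elif s1 and s2:
        bits[1] ^= 1   # error on bit 1
    elif s2:
        bits[2] ^= 1   # error on bit 2
    return bits

random.seed(1)
received = correct(noisy(encode(1)))
print(received)  # [1, 1, 1] as long as at most one bit flipped
```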

I’ll skip the gory details, but two questions should be apparent at this point: how many physical qubits do I need per logical qubit, given realistic noise levels, and is this a fixed ratio or does it grow as the number of logical qubits grows? IBM announced a 50-qubit computer in 2017, which doesn’t sound like a lot, so I assume this count does not include the redundant qubits. One source estimates ~1,000 physical qubits per logical qubit. Another credible source estimates that applying Shor’s algorithm (a well-known QC algorithm) to factor a 2,000-bit number would require 10,000 logical qubits and 13,000 physical qubits per logical qubit. These are already daunting numbers and, if you can trust the earlier estimate, suggest super-linear growth in the physical-to-logical ratio.
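
The back-of-envelope arithmetic implied by those estimates is worth spelling out:

```python
# Back-of-envelope arithmetic from the estimates above.
logical_qubits = 10_000          # to factor a 2,000-bit number with Shor's algorithm
physical_per_logical = 13_000    # estimated error-correction overhead
total = logical_qubits * physical_per_logical
print(f"{total:,} physical qubits needed")   # 130,000,000
# Today's announced machines have ~50 physical qubits -- more than
# six orders of magnitude short of that target.
```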

The contrarian (Gil Kalai) argues, and has proven for one class of computation, that this growth is unavoidably exponential. Even if he’s wrong in other cases and the growth is slower, anything faster than linear in the number of logical qubits is still a problem given where we are today and expectations for QC. That isn’t necessarily fatal, since there seem to be many alternative implementations for physical qubits; possibly one or more of these might overcome this particular challenge. But it’s another potential setback and another cause for concern. The game is far from over, but it seems a little early to be breaking out the champagne.
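
To see why exponential overhead would be fatal, compare the total physical-qubit count under a flat overhead versus an exponentially growing one. The constants in this sketch are purely illustrative, not taken from Kalai’s work:

```python
import math

# Total physical qubits = n_logical * overhead(n_logical).
# Hopeful view: overhead is roughly constant (~1,000x).
# Kalai's view, proven for one class of computation: overhead
# grows exponentially with n_logical (rate here is illustrative).
def total(n, overhead):
    return n * overhead(n)

flat = lambda n: 1_000                          # fixed ~1,000x overhead
blow_up = lambda n: 1_000 * math.exp(n / 100)   # illustrative exponential

for n in (100, 1_000, 10_000):
    print(f"n={n:>6}: flat {total(n, flat):.1e}, exponential {total(n, blow_up):.1e}")
# At n=1,000 the exponential case already needs ~2e10 physical qubits;
# at n=10,000 it is astronomically beyond any buildable machine.
```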
