I’ve written recently about the trend to move more technology to the edge, certainly to mobile devices but also to IoT edge nodes, driven largely by latency, connectivity and power considerations. One example is the move of deep learning apps to the edge to handle local image and voice recognition, which would be completely impractical if every recognition required a round trip to the cloud and back.

Another example concerns quantum computing and its potential to undermine cryptography, so that anything you think is secure (online shopping and banking, personal medical information, the national power grid, national security) could be easily accessed by anyone who can afford a quantum computer (nation states and criminal enterprises). If this is a risk at the desktop/cloud level, it is even more of a risk for mobile and IoT devices.

Conventional cryptographic methods rely on the difficulty of solving a mathematical problem, such as factoring an integer that is the product of two large prime numbers. While the complexity of these problems can be made arbitrarily high, a combination of clever mathematics and solvers distributed across massive networks of PCs has broken some impressively large cryptography keys. Cryptographers have been able to crank up the size of the key to stay ahead of the crackers, but quantum computing threatens to break through even that line of defense. (Other techniques based on elliptic curves and discrete logarithms are similarly limited.)
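To make the dependence on factoring concrete, here is a toy RSA sketch. The primes and message are deliberately tiny, hypothetical values chosen for illustration (real keys use primes of a thousand bits or more), so trial division "cracks" it instantly:

```python
# Toy RSA with deliberately tiny primes -- illustration only.
from math import gcd

p, q = 61, 53            # the secret primes
n = p * q                # public modulus (3233)
phi = (p - 1) * (q - 1)  # Euler's totient, kept secret
e = 17                   # public exponent
d = pow(e, -1, phi)      # private exponent (modular inverse, Python 3.8+)

msg = 42
cipher = pow(msg, e, n)          # encrypt: msg^e mod n
assert pow(cipher, d, n) == msg  # decrypt: cipher^d mod n

# An attacker who can factor n recovers the private key outright:
f = next(i for i in range(2, n) if n % i == 0)   # trial division works here
d_cracked = pow(e, -1, (f - 1) * (n // f - 1))
assert pow(cipher, d_cracked, n) == msg
```

At realistic key sizes the trial-division line is the only part that stops working, which is exactly why the hardness of factoring is the whole ballgame.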

The mathematics of determining how hard an algorithm is to crack is challenging; in practice it yields upper bounds based on the best-known cracking algorithms, bounds which are progressively refined over time as improved methods are discovered. The best-known classical attack on the general factorization problem (the number field sieve) has a complexity, measured as time to solve for a given key size, which is sub-exponential: it grows faster than any polynomial in the key size, but slower than a true exponential.
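That sub-exponential growth can be seen in the standard cost estimate for the general number field sieve, L(n) = exp(c · (ln n)^(1/3) · (ln ln n)^(2/3)) with c = (64/9)^(1/3). The sketch below drops the o(1) term, so treat the numbers as ballpark relative effort, not absolute operation counts:

```python
# Rough GNFS cost model (o(1) term ignored -- relative figures only).
import math

def gnfs_cost(bits):
    ln_n = bits * math.log(2)          # ln(n) for an n of `bits` bits
    c = (64 / 9) ** (1 / 3)
    return math.exp(c * ln_n ** (1 / 3) * math.log(ln_n) ** (2 / 3))

# Doubling the key size multiplies the estimated effort enormously,
# yet by far less than the 2^1024 factor a truly exponential cost
# would imply.
ratio = gnfs_cost(2048) / gnfs_cost(1024)
print(f"2048- vs 1024-bit relative effort: ~2^{math.log2(ratio):.0f}")
```

This gap between sub-exponential and exponential is precisely the room cryptographers exploit when they "crank up the key."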

Enter quantum computing (QC). Skipping the gory details, the point about QC is that, loosely speaking, a register of N quantum bits can hold a superposition of all 2^N possible settings of those bits and evaluate a function across them at once. This gives QCs the potential to solve certain problems of exponential complexity in reasonable time. Certainly the integer factorization problem (widely used in production cryptography today), which Shor's algorithm attacks directly, would be completely exposed to machines with a large enough number of quantum bits.
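It's worth noting that only the core of Shor's algorithm is quantum: a period-finding subroutine. The rest is classical number theory. The sketch below substitutes a brute-force loop for the quantum step (workable only for tiny numbers) to show how finding the order of a modulo N yields a factor:

```python
# Shor's reduction: factoring via order finding. The order() loop here
# is a classical stand-in for the quantum subroutine (tiny N only).
from math import gcd

def order(a, N):
    """Smallest r > 0 with a^r = 1 (mod N) -- the quantum step in Shor's."""
    r, x = 1, a % N
    while x != 1:
        x = (x * a) % N
        r += 1
    return r

N, a = 15, 7                         # gcd(7, 15) == 1, so a is usable
r = order(a, N)                      # r = 4 for this pair
assert r % 2 == 0                    # Shor's needs an even order
factor = gcd(pow(a, r // 2) - 1, N)  # gcd(7^2 - 1, 15) = gcd(48, 15) = 3
print(N, "=", factor, "x", N // factor)
```

A quantum computer does the order-finding step in polynomial time, which is what collapses the sub-exponential classical cost to something tractable.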

So, RIP cryptography? Not so fast. I wrote about a year ago on lattice-based cryptographic methods, specifically designed to be hard for QCs to crack. That’s the thing about math – you invent a way to crack a class of problems, then the mathematicians come up with a new algorithm which defeats your invention. Support for this method was added to OpenSSL back in 2014, though this seems to have eluded many people who write on QC threats to encryption.
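To give a flavor of the lattice approach, here is a minimal single-bit encryption sketch in the style of learning-with-errors (LWE), the problem underlying many lattice schemes. The parameters are toy values of my own choosing, far smaller than anything deployable; the point is that security rests on noisy linear algebra rather than factoring, and no efficient quantum attack on it is known:

```python
# Toy LWE-style single-bit encryption -- illustrative parameters only.
import random

q, n = 257, 16                                 # modulus and dimension (toy)
s = [random.randrange(q) for _ in range(n)]    # secret vector

def encrypt(bit):
    a = [random.randrange(q) for _ in range(n)]
    e = random.randint(-3, 3)                  # small noise term
    b = (sum(x * y for x, y in zip(a, s)) + e + bit * (q // 2)) % q
    return a, b

def decrypt(a, b):
    d = (b - sum(x * y for x, y in zip(a, s))) % q
    # Noise is small, so d sits near 0 for bit 0 and near q/2 for bit 1.
    return 1 if q // 4 < d < 3 * q // 4 else 0

for bit in (0, 1):
    assert decrypt(*encrypt(bit)) == bit
```

Without the secret s, recovering the bit from (a, b) means solving a noisy linear system, which is the hard lattice problem.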

But while that solution is good for desktops and the cloud, it’s more problematic for mobile and edge devices, which are much more power- and latency-sensitive (and therefore, as mentioned earlier, prefer to avoid or minimize tasks requiring cloud communication). To address this need, SecureRF now offers a solution using a Diffie-Hellman-like authentication protocol with a 128-bit key, but resistant to known QC attacks; it is also 60x faster than elliptic curve cryptography and uses 140x less energy. The digital signature algorithm is based on group-theoretic methods (built on braid groups), believed to be quantum-resistant. Arrow’s Chameleon96 Community board hosts this solution today.
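For readers unfamiliar with the protocol shape being referenced, here is classical Diffie-Hellman key agreement. To be clear, this is not SecureRF's primitive – theirs is built on braid groups, not modular exponentiation, and the parameters below are toy values – but the "exchange public values, derive a shared secret locally" pattern is the same:

```python
# Classical Diffie-Hellman key agreement (toy parameters, for the
# protocol shape only -- SecureRF's scheme uses a different primitive).
import secrets

p = 0xFFFFFFFFFFFFFFC5   # 2^64 - 59, a prime (far too small for real use)
g = 5                    # small public base

a = secrets.randbelow(p - 2) + 1     # Alice's secret exponent
b = secrets.randbelow(p - 2) + 1     # Bob's secret exponent
A = pow(g, a, p)                     # Alice transmits A
B = pow(g, b, p)                     # Bob transmits B

# Both sides derive the same shared secret without ever transmitting it:
# (g^b)^a = (g^a)^b mod p.
assert pow(B, a, p) == pow(A, b, p)
```

An eavesdropper sees only g, p, A and B; recovering the shared secret requires solving a discrete logarithm – which is exactly the step a quantum computer breaks, and the step SecureRF replaces with a quantum-resistant group-theoretic operation.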

A word on quantum resistance. Complexity bounds at these lofty heights are difficult to find and prove. What is known is that the QC algorithms which can crack conventional encryption, such as Shor’s and Grover’s algorithms, are defeated by the resistant schemes. Also, all quantum-resistant approaches I have seen are based on problems at least NP-hard, which are expected to be very hard to solve. That doesn’t prove they can’t be cracked by some QC method, but no one knows of such a method and QC has very little wiggle room: if a resistant algorithm is even a little super-exponential in complexity, then even quantum cracking effort grows quickly with key size, and cranking up the key once again puts the defenders back ahead.
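The "wiggle room" point is easiest to see with Grover's algorithm, whose speedup is quantitatively modest: it searches an unstructured space of 2^k keys in roughly 2^(k/2) quantum queries, i.e. it halves effective key strength rather than demolishing it. A quick sketch of the arithmetic:

```python
# Grover's quadratic speedup: brute-forcing a 2^k keyspace takes about
# 2^(k/2) quantum queries, so effective key strength is roughly halved.
def grover_effective_bits(key_bits):
    return key_bits // 2

for k in (128, 256):
    print(f"{k}-bit key: ~2^{grover_effective_bits(k)} quantum queries")

# This is why the usual symmetric-cipher advice is simply to double key
# sizes (e.g. AES-256 instead of AES-128) rather than change algorithms.
assert grover_effective_bits(256) == 128
```

Contrast this with Shor's exponential-to-polynomial collapse for factoring: Grover-class attacks can be outrun by bigger keys, which is the dynamic the paragraph above describes for super-exponential resistant schemes.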

One last technology note for QC key-cracking enthusiasts. Commercial QC doesn’t yet exist beyond relatively small qubit counts (e.g. at Google), though IBM and others claim they will release systems in the reasonably near future. Then again, the NSA is known to be working on QC, and China has already launched a satellite to support quantum key exchange over long distances. That’s a different technology, but it’s not a big stretch to assume they’re also working on QC. And we must assume Russia is doing the same. So yeah, you should probably assume that at least nation-state hackers will be able to crack your non-quantum-resistant cryptography today or very soon, and therefore all responsible systems should plan to be resistant to quantum attacks. Solutions from vendors like SecureRF will be required in any system used in commerce, automotive, medical and other secure applications.

You can read the SecureRF press release HERE and get more information on their products HERE.
