If you have checked out any of my previous blogs on quantum computing (QC), you may think I am not a fan. That isn’t entirely correct. I’m not a fan of hyperbolic extrapolations of the potential, but there are some applications which are entirely sensible and, I think, promising. Unsurprisingly, these largely revolve around applying QC to study quantum problems. If you want to study systems of superpositions of quantum states, what better way to do that than to use a quantum computer?
The quantum mechanics you (may) remember from college works well for atomic hydrogen and for free particles used in experiments like those using Young’s slits. What they didn’t teach you in college is that anything more complex is fiendishly difficult. This is largely because these are many-body problems, which can’t be solved exactly in classical mechanics either; quantum mechanics provides no free pass around this issue. In both cases, approximation methods are needed; in the quantum case, techniques like the Born-Oppenheimer approximation simplify the problem by effectively decoupling nuclear wave-functions from electron wave-functions.
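For the curious, the Born-Oppenheimer idea fits in one schematic line: because the nuclei are thousands of times heavier than the electrons, the full molecular wave-function is approximated as a product in which the nuclear coordinates enter the electronic part only as parameters:

```latex
\Psi(\mathbf{r}, \mathbf{R}) \;\approx\; \psi_{\mathrm{el}}(\mathbf{r};\mathbf{R})\,\chi_{\mathrm{nuc}}(\mathbf{R})
```

Solve the electronic problem at fixed nuclear positions, and the resulting electronic energy then acts as the potential governing the nuclear motion.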
As molecules grow in size, the techniques become progressively more sophisticated; one frontier today is something called density functional theory, which carries (for our domain) the confusing acronym DFT. Whatever method is used, all these techniques rest on a compounding pile of approximations, each well-justified, but leaving you wondering where you might be missing something important. Which is why quantum chemistry depends so heavily on experiment (spectroscopy) to provide the reality against which theories can be tested.
But what do the theorists do when the experimentalists tell them they got it wrong? Trial-and-error is too expensive and fitting the theory to the facts is unhelpful, so they need a better way to calculate. That’s where QC comes in. If you have a computer that can, by construction, accurately model superpositions of quantum states, then you should (in principle) be able to model molecular quantum states and transitions.
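To make that idea a little more concrete, here is a minimal sketch of the kind of problem a quantum computer would be handed: a molecular electronic Hamiltonian, mapped onto qubits and written as a sum of Pauli terms, whose lowest eigenvalue is the ground-state energy. The two-qubit form and the coefficients below are illustrative placeholders, not actual molecular integrals. A classical machine can simply diagonalize something this small; a quantum computer would instead prepare trial states and measure the Pauli terms (for example in a variational algorithm) to estimate the same quantity for molecules far too large to diagonalize.

```python
import numpy as np

# Single-qubit Pauli matrices
I = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)

def kron2(a, b):
    """Tensor product of two single-qubit operators (a 4x4 two-qubit operator)."""
    return np.kron(a, b)

# A two-qubit Hamiltonian of the general shape produced by mapping a small
# molecule onto qubits:  H = c0*II + c1*(ZI + IZ) + c2*ZZ + c3*XX
# NOTE: these coefficients are illustrative placeholders, not real integrals.
c0, c1, c2, c3 = -1.05, 0.39, -0.01, 0.18
H = (c0 * kron2(I, I)
     + c1 * (kron2(Z, I) + kron2(I, Z))
     + c2 * kron2(Z, Z)
     + c3 * kron2(X, X))

# Classically we can diagonalize directly; a quantum computer would instead
# prepare candidate states and measure each Pauli term to estimate this number.
ground_energy = np.linalg.eigvalsh(H).min()
print(f"Ground-state energy of the toy Hamiltonian: {ground_energy:.4f}")
```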
The Department of Energy, which had long steered clear of the QC hype, started investing last year to accelerate development along these lines. They mention understanding the mechanism behind enzyme-based catalysis in nitrogen fixation as one possible application. Modeling matter in neutron stars is another interesting possibility. Lawrence Berkeley Labs has received some of this funding to develop algorithms, compilers and other software, and novel quantum computers in support of this direction in analytical quantum chemistry.
Meanwhile, a chemistry group at the University of Chicago is aiming to better understand a set of constraints, grounded in the Pauli exclusion principle but applying to systems of more than two electrons, known as generalized Pauli constraints. As a quick refresher, the Pauli exclusion principle says that two electrons (or fermions generally) cannot occupy exactly the same quantum state. The generalized constraints add further limits in systems with more than two electrons. The mechanics of the method seem quite well established, though far from universally proven, and the underlying physics is still under debate. Again, QC offers hope of better understanding that underlying physics.
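To give a flavor of what these generalized constraints look like, the textbook example (due to Borland and Dennis) is three electrons in six spin-orbitals: if the natural-orbital occupation numbers are ordered from largest to smallest, they must satisfy not just the ordinary Pauli bound (each between 0 and 1) but also

```latex
\lambda_1 + \lambda_6 \;=\; \lambda_2 + \lambda_5 \;=\; \lambda_3 + \lambda_4 \;=\; 1,
\qquad
\lambda_4 \;\le\; \lambda_5 + \lambda_6 .
```

Conditions like these carve out a strictly smaller set of allowed states than the exclusion principle alone, and that extra structure is what the Chicago group wants to probe.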
A word of caution though. Modeling a system with N electrons will almost certainly require more than N qubits. Each electron has multiple degrees of freedom in the quantum world – the principal quantum number, angular momentum and spin angular momentum at minimum. And there’s some level of interaction of each of these with the nucleus. So figure a minimum of 6 qubits per electron, multiplied by however many physical qubits are needed per logical qubit to handle quantum error correction. We probably won’t be seeing any realistic QC calculations on proteins any time soon.
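Back-of-the-envelope, the scaling in that argument looks like the sketch below. The 6-logical-qubits-per-electron figure comes from the paragraph above; the error-correction overhead of 1,000 physical qubits per logical qubit is purely an illustrative assumption, not a measured figure for any particular hardware.

```python
def physical_qubit_estimate(n_electrons, logical_per_electron=6, ec_overhead=1000):
    """Rough qubit-count estimate: logical qubits per electron times an
    assumed error-correction overhead. Both factors are illustrative."""
    return n_electrons * logical_per_electron * ec_overhead

for n in (2, 20, 1000):  # H2, a small organic molecule, a tiny protein fragment
    print(f"{n:>5} electrons -> ~{physical_qubit_estimate(n):,} physical qubits")
```

Even with generous assumptions, protein-scale electron counts push the estimate into the millions of physical qubits, which is the point of the caution above.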