In an upcoming Innovation blog we’ll get into how quantum computers are programmed. Here I’d like to look more closely at algorithms beyond Grover and Shor, and at what practical applications there might be for quantum computing. I also take a quick look at what analysts are saying about potential market size. Even more than in AI, this is a field where it can be difficult to separate promise from reality, especially when the core concepts are so alien to conventional compute ideas. Researching the topic helped me develop a somewhat clearer view.

Algorithms
First a nod to Grover’s and Shor’s algorithms, the best-known in this area. Grover’s algorithm searches an unsorted list for a candidate that best meets some objective, say the largest number in the list, and is quadratically faster than any classical search of such a list. Shor’s algorithm factorizes large integers and is exponentially faster than the best known classical algorithms.
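To make the amplitude-amplification idea concrete, here is a minimal numpy sketch of Grover’s algorithm on a toy 3-qubit (8-item) search space. The marked index is an illustrative assumption, and a real run would happen on quantum hardware or a circuit simulator rather than a classical statevector.

```python
import numpy as np

n_items = 8            # N = 2^3 search space
marked = 5             # hypothetical index the oracle flags

# Start in the uniform superposition over all N items.
state = np.full(n_items, 1 / np.sqrt(n_items))

# Each Grover iteration: oracle phase flip, then inversion about the
# mean. ~(pi/4)*sqrt(N) iterations suffice -- the quadratic speedup.
iterations = int(np.floor(np.pi / 4 * np.sqrt(n_items)))
for _ in range(iterations):
    state[marked] *= -1                # oracle marks the target
    state = 2 * state.mean() - state   # diffusion operator

print(f"P(marked) after {iterations} iterations: {state[marked]**2:.3f}")
```

Measuring after just two iterations finds the marked item with roughly 95% probability, versus an expected 4.5 probes for a blind classical scan of eight items.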
The Quantum Fourier Transform underlies Shor’s algorithm and is conceptually similar to the classical discrete Fourier transform.
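A short numpy sketch below shows that the QFT’s unitary matrix is just the classical DFT matrix with unitary normalization (and the opposite sign convention to numpy’s FFT); the quantum advantage comes from applying this transform to 2^n amplitudes with roughly n^2 gates, not from computing a different transform.

```python
import numpy as np

n_qubits = 3
N = 2 ** n_qubits

# QFT matrix: entries omega^(j*k) / sqrt(N) with omega = exp(2*pi*i/N).
omega = np.exp(2j * np.pi / N)
qft = np.array([[omega ** (j * k) for k in range(N)]
                for j in range(N)]) / np.sqrt(N)

# Classical DFT matrix, conjugated to match sign conventions and
# rescaled to the same unitary normalization.
dft = np.conj(np.fft.fft(np.eye(N))) / np.sqrt(N)

print(np.allclose(qft, dft))                        # True
print(np.allclose(qft @ qft.conj().T, np.eye(N)))   # unitary: True
```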
Don’t get carried away by exponential speedups. Many algorithms are no faster, or only polynomially faster, when run on quantum hardware. I may get more into that in another blog in this series.
Quantum Phase Estimation is the foundation for eigenvalue (allowed state energy) estimation in quantum chemistry; the classical equivalent would be numerically solving a differential equation such as the Schrödinger equation. Variational Quantum Eigensolvers (VQE) extend these ideas to find ground-state (minimum) energies in quantum chemistry and materials science. Quantum Approximate Optimization Algorithms (QAOA) target combinatorial optimization problems, such as the traveling salesman problem or selecting a subset of discrete options from a larger set as input to drive classical optimization algorithms.
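To give a flavor of VQE, here is a minimal sketch using a toy one-qubit Hamiltonian (a bare Pauli-Z, chosen for simplicity rather than any real chemistry) and a one-parameter ansatz; the grid scan stands in for the classical optimizer, and the energy evaluation is what a quantum device would estimate by sampling.

```python
import numpy as np

H = np.array([[1.0,  0.0],
              [0.0, -1.0]])        # toy Hamiltonian: Pauli-Z

def energy(theta):
    # Ansatz |psi(theta)> = Ry(theta)|0> = [cos(theta/2), sin(theta/2)]
    psi = np.array([np.cos(theta / 2), np.sin(theta / 2)])
    return psi @ H @ psi           # expectation value <psi|H|psi>

# Stand-in for the classical optimization loop: a coarse grid scan.
thetas = np.linspace(0, 2 * np.pi, 201)
best = min(thetas, key=energy)
print(f"Estimated ground-state energy: {energy(best):.3f} (exact: -1)")
```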
Applications
I can’t find any claim of production applications today. Looking forward, there are a few important considerations that factor into when we might see such applications. First, there is a clear trend toward hybrid algorithms that intentionally switch back and forth between classical and quantum stages: classical compute handles the parts of an algorithm where no speedup is needed, while quantum handles the core components where speedup is very much required.
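The QAOA mentioned earlier is a good illustration of this hybrid pattern. Below is a sketch assuming a toy MaxCut problem on a 3-node triangle graph: the outer parameter search is the classical stage, while the state preparation and measurement inside expectation() (simulated here in numpy) is the part that would run on quantum hardware.

```python
import numpy as np
from itertools import product

edges = [(0, 1), (1, 2), (0, 2)]   # triangle graph
n = 3

# Cost of each basis state = number of edges cut by that bit assignment.
cost = np.array([sum((z >> i & 1) != (z >> j & 1) for i, j in edges)
                 for z in range(2 ** n)], dtype=float)

def expectation(gamma, beta):
    # Quantum stage (simulated): |+>^n, cost phase, then X mixer.
    state = np.full(2 ** n, 1 / np.sqrt(2 ** n), dtype=complex)
    state *= np.exp(-1j * gamma * cost)
    rx = np.array([[np.cos(beta), -1j * np.sin(beta)],
                   [-1j * np.sin(beta), np.cos(beta)]])
    state = np.kron(np.kron(rx, rx), rx) @ state
    return float(np.real(state.conj() @ (cost * state)))

# Classical stage: search (gamma, beta) for the best expected cut.
grid = np.linspace(0, np.pi, 40)
g, b = max(product(grid, repeat=2), key=lambda gb: expectation(*gb))
print(f"gamma={g:.3f}, beta={b:.3f}, "
      f"expected cut={expectation(g, b):.3f} (max cut = 2)")
```

Measuring the optimized state yields good candidate cuts with high probability; a classical post-processing stage then simply keeps the best bitstring seen.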
Second, there is a useful classification representing quantum compute today as an era of Noisy Intermediate-Scale Quantum (NISQ) – 10 to 100+ qubits, with only short coherence times. This era is expected to run through ~2030. The subsequent era of Fault-Tolerant Quantum Computing (FTQC) should allow for thousands of effectively ideal qubits (thanks to quantum error correction) and hours of hybrid computation time.
Third, quantum problems in chemistry, physics, and materials science are in many ways the most natural fit for this kind of computation and are therefore the areas where we might see the fastest progress, though applications in high-demand non-science domains could also be contenders.
Many claims seem to depend on FTQC; however, a few might be attainable in the NISQ era. Fourier transform algorithms on conventional computers are used extensively today in X-ray crystallography to guide molecular structure analysis in materials and biotech. Such analysis does not scale in complexity as fast as direct molecular modeling, which is currently limited to small molecules.
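As a reminder of what that classical workflow looks like, here is a toy 1D example with a hypothetical lattice of point atoms standing in for a real electron-density map; the diffraction pattern a detector measures corresponds to the squared magnitude of the density’s Fourier transform.

```python
import numpy as np

n = 64
density = np.zeros(n)
density[::8] = 1.0                  # point atoms on a regular lattice

F = np.fft.fft(density)             # structure factors
intensity = np.abs(F) ** 2          # what the detector records

# Sharp Bragg peaks appear at multiples of the reciprocal spacing:
print(np.nonzero(intensity > 1e-9)[0])   # [ 0  8 16 24 32 40 48 56]
```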
An urgent demand that might accelerate development is quantum sensing against terrestrial magnetic field maps as a hack-proof alternative to GPS. There are also arguments for use in finance, for portfolio design and credit scoring. Check out this video, starting at about 23:30, for a discussion of a portfolio optimization application.
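To give a sense of the problem shape, here is a toy portfolio-selection sketch with made-up returns and covariances. The objective is formulated over binary include/exclude choices (a QUBO-style problem) and brute-forced classically; this is the formulation quantum optimizers such as QAOA target when the asset count makes enumeration infeasible.

```python
import numpy as np
from itertools import product

returns = np.array([0.08, 0.12, 0.10, 0.07])   # expected asset returns
cov = np.array([[0.10, 0.02, 0.04, 0.00],      # illustrative covariance
                [0.02, 0.12, 0.01, 0.03],
                [0.04, 0.01, 0.09, 0.02],
                [0.00, 0.03, 0.02, 0.05]])
risk_aversion = 0.5

def objective(x):
    x = np.asarray(x, dtype=float)
    # Expected return minus a risk penalty, over binary selections.
    return returns @ x - risk_aversion * (x @ cov @ x)

best = max(product([0, 1], repeat=4), key=objective)
print(f"selected assets: {best}, objective = {objective(best):.4f}")
```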
Later applications are more ambitious, including a new approach to generating ammonia for agricultural fertilizer, improved battery designs, better understanding of how drugs are metabolized in the body, and deeper insight into (nuclear) fusion reactions.
Analyst views on market size
McKinsey in 2024 projected that the total quantum technologies market (quantum computing, quantum communication, and quantum sensing) could grow to as much as $97B by 2035, with quantum computing contributing between $28B and $72B of that number. They also project the quantum technologies market could be as large as $200B by 2040.
These are wide ranges and long time horizons, not encouraging for early investors. BCG says that opportunities in the NISQ era are not panning out as well as they projected in their 2021 report, though they still see active growth from 2030 to 2040 and major growth beyond that point. Meanwhile, they forecast a market for tech providers of between $1B and $2B by 2030, the bulk of it coming from public sector and corporate investment.
My view, for what it’s worth, started at “never going to get out of university labs”, swung to thinking I might be completely wrong given the scale of public and private investment, and has now settled somewhere in between. Finance and security/safety-critical drivers like an alternative to GPS, together with technology advances in quantum error correction and hybrid flows, might accelerate progress at least in some applications, possibly within the NISQ era.
Barring dramatic breakthroughs, I am sure quantum accelerators will become important eventually, perhaps by 2040, though it is interesting that claims of quantum supremacy often seem to supercharge algorithm advances on classical computers! That benefits all markets, though it is not so good for quantum.