
Even if Quantum Computers Can Be Built, They Will Not Be Fast

smeyer0028 (Guest)
John von Neumann in the 1950s already considered abstract models of computers when designing the von Neumann architecture that is used to build modern computers. The problem is that although Turing Machines are universal, they are slow because they lack random access and bit-accessing "instructions." Here is an abstract, along with references, submitted to the British Society for the Philosophy of Science (BSPS) conference that describes von Neumann's use of the MRAM model in designing modern computers. There is no class NP (problems that require guessing) in the MRAM model because computers can enumerate as fast (in the polynomially bounded sense) as Turing Machines can guess.

===========

John von Neumann's Natural Philosophy of Computing

John von Neumann's transformation from a mathematician in the 1920s to a natural philosopher in the 1950s is best expressed by a story Neumann told conveying Wolfgang Pauli's criticism of formal mathematics: 'If a mathematical proof is what matters in physics, you would be a great physicist.' (Thirring[2001], p. 5). Neumann transitioned from an adherent of David Hilbert's programme, which held that all knowledge could be axiomatized, in the 1920s to a natural philosopher of computation in the 1950s. Kurt Godel's incompleteness results in the early 1930s and discussions with physicists in the 1930s were among the factors that may have motivated Neumann's change. For example, in an exchange with Niels Bohr at the 1938 Warsaw New Theories in Physics conference (IntCoop[1938]), Neumann agreed with Bohr that his Hilbert Space formalization is problematic.

Starting in the early 1940s, Neumann became focused on the possibilities of digital computers (see Aspray[1990] for a detailed history). Neumann did not explicitly discuss his philosophy of computation, but he did discuss his thinking about the abstract properties that a computer architecture should have. He seems not to have considered Turing Machines (TMs) at all because he had already rejected the Hilbert programme. His explicit rejection of formal neural networks also sheds light on his computational philosophy (Aspray[1990], note 94, p. 321). This paper provides an overview of Neumann's thinking by discussing a modern formalization of computation using an abstract model called MRAMs (random access machines with unbounded cell size and unit-cost multiply), first studied by Hartmanis and Simon (Hartmanis[1974][1974a]). Neumann explicitly listed the requirements for his now almost universally used von Neumann computer architecture, and they match the properties of the MRAM model. In contrast to TMs, which have an unbounded number of bounded-size memory cells, Neumann assumed that a computer would have a finite number of unbounded-size (in the 19th-century sense of infinity) memory cells. Neumann also argued that some sort of intuition needed to be built into programs instead of brute-force searching (Aspray[1990], p. 62). The MRAM model allows random access table lookup data structures (see Meyer[2016] for more details on Neumann's thinking).
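
To make the access-cost contrast concrete, here is a minimal sketch in Python (my illustration, not part of the submitted abstract; the function names and the million-cell memory are invented for the example). A TM-style machine walks its tape cell by cell, while an MRAM reads any address at unit cost:

def tape_lookup(tape, address):
    # Turing-machine-style access: the head walks cell by cell,
    # so reaching `address` costs O(address) steps.
    steps = 0
    for i, cell in enumerate(tape):
        steps += 1
        if i == address:
            return cell, steps
    raise IndexError("address beyond end of tape")

def mram_lookup(memory, address):
    # MRAM-style access: one random-access read, unit cost.
    return memory[address], 1

memory = list(range(1_000_000))
_, tm_steps = tape_lookup(memory, 999_999)    # about 10**6 steps
_, ram_steps = mram_lookup(memory, 999_999)   # 1 step
print(tm_steps, ram_steps)                    # 1000000 1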

There are modern consequences of Neumann's philosophy. In the MRAM model the P?=NP problem does not exist (or rather, the answer is that there is no difference between the class of problems solvable by nondeterministic guessing and the class solvable by deterministic searching). Many modern philosophical questions implicitly assume that nondeterministic TMs are more powerful than deterministic machines. For example, Shor's quantum computation algorithms presuppose the framing of the P?=NP problem (Shor[1997]). Scientists are seemingly building physical devices based on the rejected assumptions of the Hilbert programme.

An interesting story related to Neumann's philosophy is that advocates of the importance of the P?=NP problem (does guessing speed up computation by more than a polynomial bound?) found a letter from Godel to Neumann in the Godel Archive that they interpret as Godel supporting the importance of the P?=NP problem. However, in a letter to Oswald Veblen, probably written in the 1940s and relating to an Institute for Advanced Study appointment, Neumann shows skepticism toward Godel's later work (see Hartmanis[1989] for the Godel P?=NP letter and Neumann[2005], p. 276 for the Neumann letter).

To me the most important consequence of Neumann's natural philosophy and computer architecture is that TMs are really very weak (slow) computing machines. It is true that TMs are universal in the Church-Turing sense (Copeland[2015]): any TM can calculate anything recursively computable (for background, the question 'are two regular expressions equivalent?' is outside NP but still computable). The thesis is often expressed as: any TM can simulate any computing device. The problem is that modern thinking views computational efficiency as important, and in that sense TMs are slow. Another way of viewing the importance of MRAMs and Neumann's philosophy is that they falsify (in the Popperian sense) the Church-Turing thesis.

References:
Aspray[1990]
Aspray, W. John von Neumann and The Origins of Modern
Computing. MIT Press, 1990.

Copeland[2015]
Copeland, J. The Church-Turing thesis. The Stanford Encyclopedia
of Philosophy (Summer 2015 Edition), 2015. URL Feb. 2018:
plato.stanford.edu/archives/sum2015/entries/church-turing

Hartmanis[1974]
Hartmanis, J. and Simon, J. On the Structure of Feasible
Computations. Lecture Notes in Computer Science, Vol. 26,
1974, 1-49. Also Cornell eCommons, URL Jan. 2016.

Hartmanis[1974a]
Hartmanis, J. and Simon, J. The Power of Multiplication In
Random Access Machines. 15th IEEE Conference on Switching
and Automata. IEEE, Oct. 1974, 13-23.

Hartmanis[1989]
Hartmanis, J. Godel, von Neumann and the P=?NP Problem.
Periodic Structural Complexity Column, April 1989. Also
Cornell eCommons digital repository, URL Jan. 2016.

IntCoop[1938]
International Institute of Intellectual Co-operation, New Theories
in Physics, conference proceedings, Warsaw, 1938.

Meyer[2016]
Meyer, S. Philosophical Solution to P=?NP: P is equal to NP.
arXiv:1603.06018, 2016.

Neumann[2005]
Von Neumann, J. (Redei, M. ed.) John Von Neumann: Selected Letters.
History of Mathematics Series, Vol. 27, American Mathematical Society, 2005.

Shor[1997]
Shor, P. Polynomial-Time Algorithms for Prime Factorization and
Discrete Logarithms on a Quantum Computer. arXiv:quant-ph/9508027v2.

Thirring[2001]
Thirring, W. J. v. Neumann’s Influence in Mathematical Physics.
In Redei, M. and Stoltzner, M. (eds.) John von Neumann and the
Foundations of Quantum Physics. Vienna Circle Institute
Yearbook 8, Kluwer, 2001, 5-10.
 
I must be dumb but I don't see the relation to quantum computing.
 
Let me try to answer. I am working on my IACAP paper (International Association for Computing and Philosophy conference). For the reasoning behind the claim that quantum computers are more powerful than Turing Machines, see Peter Shor's paper (it is on arXiv):

"Polynomial-Time Algorithms for Prime Factorization and Discrete Logarithms on a Quantum Computer"

The key point here is that von Neumann already understood that there are computers more powerful (faster in the polynomially bounded sense) than TMs. Shor, in his description of real QC algorithms, correctly sees that there can be machines faster than Turing Machines, but he does not recognize that Neumann, in his design of current digital computers, understood the power of random access MRAM machines.

Shor first writes (see pages 1 and 3):

====
"With the development of practical computers, it has become apparent that the distinction between computable and non-computable functions is much too coarse; computer scientists are now interested in the exact efficiency with which specific functions can be computed. This exact efficiency, on the other hand, is too precise a quantity to work with easily. The generally accepted compromise between coarseness and precision distinguishes efficiently and inefficiently computable functions by whether the length of the computation scales polynomially or superpolynomially with the input size. The class of problems which can be solved by algorithms having a number of steps polynomial in the input size is known as P."
===
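
To illustrate the polynomial/superpolynomial divide Shor describes, here is a small sketch (mine, not Shor's; the cubic exponent is just an arbitrary example of a polynomial bound):

# Step counts for a hypothetical cubic-time algorithm (in P) versus
# a brute-force search over all 2**n settings of n bits.
for n in (10, 20, 40, 80):
    poly = n ** 3       # polynomial: grows, but stays feasible
    superpoly = 2 ** n  # superpolynomial: quickly becomes infeasible
    print(f"n={n:3d}  n^3={poly:,}  2^n={superpoly:,}")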

Shor then writes the following. (My claim is that Shor does not understand that von Neumann, who was an expert on quantum mechanics, anticipated the slowness of TMs; i.e., not only are quantum computers faster than TMs, but so are von Neumann style MRAM machines, so QCs are not faster.)

=====
"Quantum mechanical objects often behave quite differently from how our intuition, based on classical mechanics, tells us they should. It thus seems plausible that the natural computing power of classical mechanics corresponds to Turing machines, while the natural computing power of quantum mechanics might be greater."

=====
 
Can these MRAM computers actually be implemented in reality? What I understand is that the power of QC lies in the coherence of the qubits. I think this means that for each added qubit in the coherent set you need to double the memory size of the MRAM to have the same 'efficiency' in computation. Beyond a certain number of coherent qubits it becomes impractical (or even impossible, for lack of atoms in the universe) to physically implement the MRAM.
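
Concretely, that scaling can be sketched as follows (a rough sketch of my own; the 16 bytes per amplitude assumes a double-precision classical state-vector simulation):

# A classical state-vector representation of n coherent qubits stores
# 2**n complex amplitudes, so memory doubles with every added qubit.
BYTES_PER_AMPLITUDE = 16  # one double-precision complex number

def state_vector_bytes(n_qubits):
    return (2 ** n_qubits) * BYTES_PER_AMPLITUDE

for n in (10, 30, 50, 100):
    print(f"{n:3d} qubits -> {float(state_vector_bytes(n)):.2e} bytes")
# 100 qubits already needs ~2e31 bytes, far beyond any physical memory.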
 
I think I was not clear. John von Neumann was a quantum physics expert. When he designed the von Neumann architecture he tried various computation models, including various parallel physical devices. The MRAM model was defined and analyzed by Hartmanis and Simon in the 1970s as the best model of the von Neumann architecture. The problem with quantum computers as currently defined is that they assume probability can be used to guess solutions to problems faster than MRAM (random access and bit selecting) searching. That is not true. One example is that the philosophy of probability literature admits that probability contains a psychological component (the measurement problem in QM). Current computation methodology assumes P!=NP, so that quantum computers can use simultaneous guessing to solve problems in NP in the same polynomially bounded time that deterministic Turing Machines cannot match. I claim, from reading von Neumann's writings, that he understood that his MRAM machine was as fast as guessing machines in general.
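
To sketch what 'as fast as guessing' can mean for a machine with unbounded cells (my own illustration, not an example from von Neumann or from Hartmanis and Simon): treating one Python arbitrary-precision integer as a single unbounded MRAM cell, n shift-and-or operations scan all 2**n subsets in the subset-sum problem, a problem in NP:

def subset_sum(weights, target):
    # Bit k of `reachable` is 1 iff some subset of the weights sums
    # to k, so one big integer enumerates every subset at once.
    # (The MRAM model charges unit cost per shift/or; on real
    # hardware the cost grows with the word length.)
    reachable = 1                    # bit 0: the empty subset sums to 0
    for w in weights:
        reachable |= reachable << w  # add w to every sum reached so far
    return bool((reachable >> target) & 1)

print(subset_sum([3, 5, 7, 11], 15))  # True: 3 + 5 + 7
print(subset_sum([3, 5, 7, 11], 4))   # False: no subset sums to 4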
 
I hope it will be possible for you to make your IACAP paper available to us. If not, please consider sending me a copy by in-mail.
 