The Argument against Quantum Computers – a CERN Colloquium and a New Paper

Let me announce my CERN colloquium this Thursday, August 22, 2019, 16:30–17:30, entitled “The argument against quantum computers.” If you are at CERN or in the neighborhood, please come to the lecture. (Tea and coffee will be served at 16:00.) If you are farther away, there is a live broadcast.

A few weeks ago I uploaded to the arXiv a new paper with the same title, “The argument against quantum computers.” The paper will appear in the volume Quantum, Probability, Logic: Itamar Pitowsky’s Work and Influence, Springer Nature (2019), edited by Meir Hemmo and Orly Shenker. A short abstract for the lecture and the paper:

We give a computational complexity argument against the feasibility of quantum computers. We identify a very low complexity class of probability distributions described by noisy intermediate-scale quantum computers, and explain why it will allow neither good-quality quantum error-correction nor a demonstration of “quantum supremacy.” Some general principles governing the behavior of noisy quantum systems are derived.

The new paper and lecture have the same title as my 2018 interview with Katia Moskvitch at Quanta Magazine (see also this post). Note that Christopher Monroe has recently contributed a very interesting comment to the Quanta article. My paper is dedicated to the memory of Itamar Pitowsky, and for more on Itamar see the post Itamar Pitowsky: Probability in Physics, Where does it Come From? See also this previous post for two other quantum events in Jerusalem: a seminar in the first semester and a winter school on The Mathematics of Quantum Computation on December 15–19, 2019.

A slide from a lecture by Scott Aaronson where he explains why soap bubble computers cannot solve the NP-complete Steiner-tree problem. Noisy intermediate-scale quantum (NISQ) circuits are computationally much more primitive than Scott’s soap bubble computers, and this will prevent them from achieving either “quantum supremacy” or good-quality quantum error-correcting codes. (source for the picture)

Low-entropy quantum states give probability distributions described by low-degree polynomials, and very low-entropy quantum states give chaotic behavior. Higher entropy enables classical information.
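
The “low-degree polynomials” here are Fourier–Walsh expansions of probability distributions on {0,1}^n. As a minimal toy illustration (my own sketch, not a construction from the paper; the values of n and d and the random distribution are arbitrary choices), the following Python snippet truncates such an expansion at degree d and measures how much of the distribution is lost:

```python
# Toy sketch: truncate the Fourier-Walsh expansion of a distribution on
# {0,1}^n to degree <= d and measure the truncation error. The parameters
# n, d and the random distribution are illustrative assumptions.
import numpy as np

n, d = 4, 2

def chi(s, x):
    """Walsh character chi_s(x) = (-1)^{|s AND x|} for bitmasks s, x."""
    return -1.0 if bin(s & x).count("1") % 2 else 1.0

rng = np.random.default_rng(0)
p = rng.random(2 ** n)
p /= p.sum()  # a toy probability distribution on {0,1}^n

# Fourier-Walsh coefficients: p_hat(s) = 2^{-n} * sum_x p(x) * chi_s(x)
p_hat = [sum(p[x] * chi(s, x) for x in range(2 ** n)) / 2 ** n
         for s in range(2 ** n)]

# Reconstruction keeping only the coefficients of degree |s| <= d
p_low = np.array([sum(p_hat[s] * chi(s, x)
                      for s in range(2 ** n) if bin(s).count("1") <= d)
                  for x in range(2 ** n)])

tv = 0.5 * np.abs(p - p_low).sum()  # total variation lost to truncation
print(f"total variation lost by degree-{d} truncation: {tv:.4f}")
```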



9 Responses to The Argument against Quantum Computers – a CERN Colloquium and a New Paper

  1. Peter Shor says:

    Lattice QCD simulations and lattice QFT simulations take an incredible amount of computer time, generally produce good results, and don’t add any extra noise to the model. So why would we need extra noise to simulate quantum computers, but not to simulate quantum chromodynamics?

    • Gil Kalai says:

      Dear Peter, greetings from Oberwolfach, and thanks for the comment!

      My paper deals with noisy quantum circuits. For this model, error-correcting codes and the threshold theorem provide a very good argument for why universal quantum computers are possible: all that is needed is to push the level of noise below a certain constant. My argument gives a good reason why, already in the NISQ domain, it is impossible to reduce the noise level enough to achieve quantum supremacy or good-quality quantum error-correcting codes.
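
      To illustrate, here is a minimal numerical sketch of the threshold phenomenon (a toy illustration, not a construction from the paper), using the standard concatenated-code recursion p -> p^2/p_th; the threshold value p_th below is an assumed round number, not a claim about any specific code.

      ```python
      # Sketch of threshold behavior: below an assumed threshold p_th, each
      # level of concatenation squares the rescaled error rate, so the
      # effective error falls doubly exponentially; above it, errors grow.
      p_th = 1e-2  # illustrative threshold, not tied to any specific code

      def effective_error(p, levels):
          """Effective error rate after `levels` levels of concatenation."""
          for _ in range(levels):
              p = p ** 2 / p_th
          return p

      for p in (2e-2, 5e-3):  # one rate above, one below the assumed threshold
          rates = ", ".join(f"{effective_error(p, k):.1e}" for k in range(4))
          print(f"p = {p:.0e}: {rates}")
      ```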

      You raise a different issue: why is noise necessary at all for modeling quantum computers in the first place, and are models of high-energy physics somehow “exempt” from noise? This is certainly an important question that many people discuss and that I have also addressed in my papers. Your comment is a good opportunity to come back to this point.

  2. John Sidles says:

    QCD presents (obviously) very different quantum dynamics from QED – both at high energies and at low energies – and these quantum dynamical differences suggest multiple mechanisms by which the quantum simulation-complexity of QCD might (plausibly) be very different from that of QED.

  3. John Sidles says:

    Let’s imagine a world (for example) that is governed by the same QCD, QED, and gravitational dynamics as our world, in which the fundamental units of charge, mass, length, and time are stipulated to be the electron charge, the electron mass, the Bohr radius, and the inverse Hartree frequency (the Wikipedia page ‘Hartree atomic units’ describes these units).

    Our imagined world differs, however, in that the speed of light is not c = 1/\alpha ~ 137 (in natural atomic units), but 10,000x faster, i.e., c = 1.37×10^6. Save for this one change, the Standard Model applies in this “FastLight World” (as we will call it).

    The light-speedup of FastLight World has quite a number of dramatic consequences, which are great fun to try to calculate. For example, positronium lifetimes increase by a factor of c^3 ~ 10^12, whereas photon-photon scattering cross-sections decrease by an (astonishing) factor of 1/c^18 ~ 10^-72.
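
    As a quick arithmetic check of these factors (reading the exponents, as the quoted numbers imply, as powers of the light-speed ratio 10^4 between the two worlds, rather than of c itself):

    ```python
    # Arithmetic check of the quoted FastLight factors, interpreting c^3 and
    # 1/c^18 as powers of the light-speed ratio between the two worlds.
    c_ours, c_fast = 137.0, 1.37e6  # c in atomic units: our world vs. FastLight
    s = c_fast / c_ours             # light-speed ratio, 1.0e4
    print(f"positronium lifetime factor     s**3  = {s ** 3:.0e}")    # 1e+12
    print(f"photon-photon cross-section   s**-18  = {s ** -18:.0e}")  # 1e-72
    ```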

    Photons are very hard to observe in FastLight World, so much so that we can imagine that FastLight physicists don’t even know they exist. Nonetheless, with the replacement ‘photon’ -> ‘phonon’, pretty much all the ideas of QED field theory can be thoroughly explored in FastLight World, both theoretically and experimentally.

    E.g., the discussion of ‘masers’ in the Feynman lectures goes through without change, provided that the resonant microwave cavity is replaced by a capacitor in series with a piezoelectric crystal. More subtly, what we call “Hawking radiation” would be both theoretically predicted and experimentally observed in FastLight World, as the thermal phonons emitted by acoustic black holes associated to superfluid flow in converging channels.

    QCD would be computationally much easier to simulate in FastLight World (quark pair-production being strongly suppressed), while QED would be much harder to simulate computationally (qubit superradiant decoherence also being strongly suppressed).

    Equally interesting are the fundamental quantum dynamical phenomena that don’t change in FastLight World, most notably the Josephson constant (that governs the quantum dynamics of Josephson junctions) and the von Klitzing constant (that governs the integer quantum Hall effect). And of course, pretty much all of FastLight World’s (bio)chemistry and solid state physics is unaltered.
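
    A one-line check of why these two constants are immune to the light-speed change: the standard expressions K_J = 2e/h and R_K = h/e^2 involve only the electron charge and the Planck constant, with no factor of c (exact SI values below):

    ```python
    # The Josephson and von Klitzing constants depend only on e and h,
    # so changing the speed of light leaves both of them untouched.
    e = 1.602176634e-19  # elementary charge, C (exact SI value)
    h = 6.62607015e-34   # Planck constant, J*s (exact SI value)
    print(f"K_J = 2e/h   = {2 * e / h:.6e} Hz/V")  # ~4.835978e+14
    print(f"R_K = h/e**2 = {h / e ** 2:.6e} ohm")  # ~2.581281e+04
    ```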

    This imagined FastLight World could plausibly achieve technological capacities comparable to our own without its quantum physicists requiring any notion of relativistic physics in general, or gauge field theories in particular … without ever realizing that photons exist, or that Galilean invariance is not exact.

  4. John Sidles says:

    What experiments would open the eyes of FastLight World’s quantum physicists to the physical reality of post-Galilean quantum physics? One plausible answer is the generic failure of FastLight World’s attempts to demonstrate scalable Quantum Supremacy.

    A great many qubit schemes work much better in FastLight World, in consequence of greatly diminished electrodynamic decoherence. Thermal magnetic noise associated to metallic conductors is negligible, for example.

    Yet even in FastLight World, Quantum Supremacy experiments generically fail whenever the number of physical qubits exceeds 1/\alpha ~ 1.37×10^6, in consequence of generically decoherent superradiant quantum electrodynamics.

    Scientifically speaking, the single most important result of FastLight World’s various Quantum Initiatives, would be an appreciation by FastLight physicists of the crucial role played by superradiant quantum dynamical coupling to the vacuum … a coupling that ensures both the practical failure of scalable Quantum Supremacy demonstrations, and (more importantly) the practical success of the Extended Church-Turing Thesis.

    In summary, superradiant quantum electrodynamical considerations suggest that the noise postulates of Kalai’s preprint are physically natural … a physical naturality that will be warmly welcomed by all fans of the Extended Church-Turing Thesis (but that’s another comment).

  5. vznvzn says:

    The comparison with a soap bubble computer, and the way that noise plays a role in both, is very intriguing, especially considering the new crosscutting fluid paradigm of QM + physics, e.g., pilot-wave hydrodynamics, the Madelung fluid, etc.

    FLUID PARADIGM SHIFT 2018½

  6. Pingback: Jeff Kahn and Jinyoung Park: Maximal independent sets and a new isoperimetric inequality for the Hamming cube. | Combinatorics and more

  7. Pingback: Quantum computers: amazing progress (Google & IBM), and extraordinary but probably false supremacy claims (Google). | Combinatorics and more

  8. Pingback: Greatest Hits 2015-2022, Part II | Combinatorics and more
