Quanta Magazine published an interview with me about quantum computers. It was a pleasure discussing this issue with Katia Moskvitch, and I also enjoyed the photo session with David Vaaknin, who took a video of me explaining the importance of quantum noise.

Pingback: Futureseek Daily Link Review; 9 February 2018 | Futureseek Link Digest

⭐ nice, great to hear from a contrarian/ skeptic & strong new member of the Quantum Orthodox Doubters Society originally started by Bell decades ago. ps you might also consider P=BQP amidst all the QM computing mania. also for amusement sometime read the papers of Dyakonov…

Bell was really the problem, and too many people rejected Copenhagen. People can’t get out of classical thinking about quanta. When you force classical reality on the quantum world, you make people think crazy things, such as probabilities being something objective that interferes. Reality is down there, but it’s not the reality we are used to. That in no way implies that QCs are possible. It’s COMPLETELY FALSE that QCs follow from QM. Uncertainty leads to speedups? Not proven. It’s debatable right from first principles. I agree with Roger Schlafly on this (see his blog, Dark Buzz): QM gives you expected values of observables, and that’s it. He is one of the few people willing to question this stuff properly. I also agree with him that QCs follow from the Schrödinger’s-cat fallacy; you have to be a many-worlds believer to think they make any sense. Probability is used in QM for the same reason it is used in classical physics, to quantify uncertainty: “Both A and B(H) are examples of von Neumann algebras, and a generalization of classical measure theoretic probability can be developed by defining generalized probability distributions to be positive normalized functionals on such algebras. This generalized theory has both classical probability theory and quantum theory as special cases.” Psi-epistemic Bayesian views have been reviewed here: https://arxiv.org/pdf/1409.1570.pdf
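For concreteness, the quoted claim about generalized probability can be spelled out in a few lines (a minimal numpy sketch; the matrices are my own illustrative choices, not from the quoted source). On a matrix algebra, a positive normalized functional is exactly A ↦ tr(ρA) for a density matrix ρ, and a diagonal ρ paired with a diagonal observable reduces to ordinary classical probability:

```python
import numpy as np

# A "generalized probability distribution": a positive, normalized functional
# on an algebra of observables. On the matrix algebra M_n, every such
# functional has the form A -> tr(rho @ A) for a density matrix rho.
rho = np.array([[0.7, 0.2],
                [0.2, 0.3]], dtype=complex)          # example qubit state
assert np.allclose(rho, rho.conj().T)                # Hermitian
assert np.all(np.linalg.eigvalsh(rho) >= -1e-12)     # positive semidefinite
assert np.isclose(np.trace(rho).real, 1.0)           # normalized

A = np.array([[1, 0], [0, -1]], dtype=complex)       # observable (sigma_z)
print(np.trace(rho @ A).real)                        # quantum expected value: 0.4

# Classical special case: diagonal rho is a probability vector and a diagonal
# observable is a random variable; the functional becomes an ordinary mean.
p = np.array([0.7, 0.3])                             # classical distribution
x = np.array([1.0, -1.0])                            # values of the variable
print(p @ x)                                         # same number: 0.4
```

The diagonal case is literally classical expectation, which is the sense in which the generalized theory has classical probability as a special case.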

Apparently, physicists have only recently discovered graph theory. lol

I should mention that Scott Aaronson recently commented the following: “Which interpretation of QM you espouse (e.g., MWI, Copenhagen, or Bohm) has no effect—none, zero—on what you should predict about the scalability of quantum computation, because by explicit design, all interpretations make exactly the same predictions for any experiment you can do on any system external to yourself.” However, he says the performance advantage of QCs is “all about being able to choreograph interference patterns in a Hilbert space of exponentially large dimension.”

His view of QM presupposes something additional to it, namely mathematical realism. Given his view of what constitutes QM, I agree, but that’s not QM. QM is a minimalist theory. He might want to look up the ensemble interpretation.

https://en.wikipedia.org/wiki/Ensemble_interpretation

Probability is used to quantify counterfactuals (note: many are meaningless), but probability is just an interpretation. Probabilities are not something you can empirically show to exist, and probability is used in ways that treat quanta as if they were classical. Talking about the probability of an electron’s location assumes the electron is a particle. It isn’t! QM gives expected values of observables, and we use probability for the same reason we use it in classical physics. Non-commuting observables? Big deal. The uncertainty principle is the entire basis of QM.

“Scott says the core of the quantum voodoo is amplitude interference. But all sorts of classical phenomena have interfering waves, and that is not particularly mysterious. It only becomes mysterious when you think of those amplitudes as probabilities or generalized probabilities.”

http://blog.darkbuzz.com/2016/12/comic-about-quantum-computing.html

Scott says that there are “positive and negative amplitudes” that interfere, but the wave function for an electron is a complex-coefficient spinor function. It’s not so simple. Again, who ever said it was probabilities interfering?
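That said, the arithmetic behind “amplitudes interfere, probabilities don’t” is easy to spell out (a sketch; the half-amplitudes model one output port of a Mach-Zehnder interferometer and are my own illustrative choice, not anything from Scott’s quote):

```python
import numpy as np

# One output port of a Mach-Zehnder interferometer: each of the two paths
# contributes amplitude 1/2 (two 50/50 beamsplitters); the second path
# picks up a relative phase phi.
for phi in (0.0, np.pi / 2, np.pi):
    a1 = 0.5
    a2 = 0.5 * np.exp(1j * phi)
    p_amplitudes = abs(a1 + a2) ** 2               # add amplitudes, then square
    p_probabilities = abs(a1) ** 2 + abs(a2) ** 2  # add squared magnitudes directly
    print(f"phi={phi:.3f}  amplitudes-first={p_amplitudes:.3f}  "
          f"probabilities-first={p_probabilities:.3f}")

# phi=0  -> 1.000 vs 0.500 (constructive interference)
# phi=pi -> 0.000 vs 0.500 (destructive interference)
# The "probabilities-first" column never depends on the phase: no interference.
```

Adding the squared magnitudes first gives a phase-independent 0.5, which is exactly why the interference must happen at the level of (complex) amplitudes, not probabilities.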

Terry Tao: “Note carefully that sample spaces (and their attendant structures) will be used to model probabilistic concepts, rather than to actually be the concepts themselves. This distinction (a mathematical analogue of the map-territory distinction in philosophy) actually is implicit in much of modern mathematics, when we make a distinction between an abstract version of a mathematical object, and a concrete representation (or model) of that object.”

Nathaniel David Mermin writes, “In a 1931 letter from Erwin Schrödinger to Arnold Sommerfeld: ‘Quantum mechanics forbids statements about what really exists–statements about the object. It deals only with the object-subject relation. Although this holds, after all, for any description of nature, it appears to hold in a much more radical and far-reaching sense in quantum mechanics.'”

Scott is WRONG to say QCs trivially follow from QM. Not true. Fake news. In fact, the success or failure of QCs will tell us something deeper than existing QM does. He says this himself: “The one thing in foundations of QM that does matter for QC, is simply whether you believe QM is literally true or whether you think it needs to be modified. As I never tire of pointing out (because others never tire of forgetting it), if QM did need to be modified, that would be a far greater scientific breakthrough than a mere success in building scalable QCs, and we can only hope that the quest to build QCs would terminate with such an exciting outcome.”

see also Preskill on NISQ (noisy intermediate-scale quantum) computing. it appears even the proponents have conceded. more on QM computing in an overview from last summer

https://vzn1.wordpress.com/2017/07/17/qm-computing-summer-2017-update/

This stuff doesn’t rise to the level of laughter. Any Bayesian refutes this stuff before they even start talking about noise. Even considering noise, the debates were fairly definitive in the 90s. For instance, Subhash Kak in 1999: https://arxiv.org/abs/quant-ph/9805002

People keep ignoring the Bayesian issue here. The Born rule is just metaphysical fluff, and QM simply produces expected values of observables. Probabilities are derived (psi is complex, a spinor, a vector, can be negative, etc., and yet ⟨psi|A|psi⟩ is a real number) and can be seen as a lack of information about the system. In the words of Heisenberg, “One may call these uncertainties [i.e. the Born probabilities] objective, in that they are simply a consequence of the fact that we describe the experiment in terms of classical physics; they do not depend in detail on the observer. One may call them subjective, in that they reflect our incomplete knowledge of the world.” http://www.math.ru.nl/~landsman/Born.pdf

Those like Lubos Motl tell me that “probabilities interfere all the time,” but there is no empirical evidence of this, just as we never observe an electron that is 1/4 in a cavity and 3/4 out. If probabilities come late in the calculation, then “amplitudes” might interfere, but not probabilities. The game is up before you even begin. QC people are repeating the Schrödinger’s-cat fallacy. You can’t run Turing complexity arguments backwards, but that’s what Feynman did. Uncertainty leads to speedups? ROFL!

Baby stuff.

The formatting fell out: ⟨psi|A|psi⟩ is a real number.
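That claim is straightforward to verify numerically (a sketch; the dimension-3 state and observable are random stand-ins of my own choosing): for Hermitian A, ⟨psi|A|psi⟩ equals its own complex conjugate, so its imaginary part vanishes even though psi is complex throughout.

```python
import numpy as np

rng = np.random.default_rng(0)

# A random normalized complex state vector psi (dimension 3, just a stand-in).
psi = rng.normal(size=3) + 1j * rng.normal(size=3)
psi /= np.linalg.norm(psi)

# A random Hermitian observable: A = (M + M^dagger) / 2.
M = rng.normal(size=(3, 3)) + 1j * rng.normal(size=(3, 3))
A = (M + M.conj().T) / 2

# <psi|A|psi>: complex arithmetic all the way down, yet the result is real.
expval = psi.conj() @ A @ psi
print(expval.real, abs(expval.imag))   # imaginary part is zero up to rounding
assert abs(expval.imag) < 1e-12
```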

am quite sympathetic with these views. what would you say is the best survey/ overview of Bayesian interpretation of QM? have maybe heard a little but am not familiar.

believe have seen some born-rule-like measurement laws / a strong analogy outside of QM in classical physics but haven’t been able to nail it down exactly myself (anyone else know of any refs?). it seems almost nobody is drawing the analogy. think that needs to change asap!

in particular it seems to me related to the root-mean-square measurement of intensity or avg power of a wave… it seems there is so much uncanny similarity of the QM formalism with classical wave mechanics in so many ways but again, really struggle to find refs that draw the analogy as tightly as possible. seems it’s almost as if it’s been heavily obscured by interpretational bias. “when you have a hammer everything looks like a nail.” the hammer seems to be something like the copenhagen interpretation.
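for what it’s worth, the RMS/intensity analogy can be made concrete (a sketch; the amplitudes and phase below are arbitrary choices of mine): the time-averaged power of two superposed classical waves is a1²/2 + a2²/2 + a1·a2·cos(φ), carrying the same squared-sum cross term the Born rule produces from complex amplitudes.

```python
import numpy as np

# Two classical waves with the same frequency and a relative phase phi.
a1, a2, phi = 1.0, 0.5, np.pi / 3

# Sample one full period uniformly (endpoint excluded so the average is exact).
t = np.linspace(0.0, 2 * np.pi, 100000, endpoint=False)
total = a1 * np.cos(t) + a2 * np.cos(t + phi)

# Time-averaged power (mean square) of the superposition.
mean_square = np.mean(total ** 2)

# Closed form: a1^2/2 + a2^2/2 + a1*a2*cos(phi). The cross term is the
# interference term -- structurally the same as in |c1 + c2|^2 for
# complex amplitudes under the Born rule.
predicted = a1 ** 2 / 2 + a2 ** 2 / 2 + a1 * a2 * np.cos(phi)
print(mean_square, predicted)   # both 0.875
```

the quadratic dependence of measured intensity on added wave amplitudes is exactly the structural parallel the comment is gesturing at.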

some recent musings on related ideas here

https://vzn1.wordpress.com/2017/09/08/latest-on-killing-copenhagen-interpretation-via-fluid-dynamics/

Pingback: The Argument against Quantum Computers – a CERN Colloquium and a New Paper | Combinatorics and more