Quantum computing: achievable reality or unrealistic dream

[Drawings: Michel Dyakonov's view on QC | My view (based on Michel's drawing*)]

Update:

[Drawing: Alexander Vlasov's view (based on Michel and Konstantin's drawing)]

It has been a while since I devoted a post to quantum computation. Meanwhile, we had a cozy, almost private, easy-going, and very interesting discussion thread on my previous, March 2014 post (which featured my Simons Institute videotaped lectures I and II).

What can we learn from a failure of quantum computers?

Last week we had a workshop on “Quantum computing: achievable reality or unrealistic dream.” This was a joint venture of the American Physical Society and the Racah Institute of Physics here at HUJI, organized by Professor Miron Ya. Amusia, and it featured Nadav Katz and me as the main speakers. Here are the slides of my lecture: What can we learn from a failure of quantum computers.


Noise Sensitivity and BosonSampling

Earlier, I gave a lecture in our CS colloquium about a recent work with Guy Kindler on noise sensitivity of BosonSampling. We show that for a constant level of noise, noisy BosonSampling can be approximated by bounded-depth computation, and that the correlation between the noisy outcome and the noiseless outcome tends to zero if the noise level is ω(1/n) where n is the number of bosons.  Here is the paper Gaussian noise sensitivity and BosonSampling, the videotaped lecture  Complexity and sensitivity of noisy BosonSampling, and the slides of the lecture.
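To give a rough sense of where the ω(1/n) threshold comes from, here is a back-of-the-envelope calculation in the real Gaussian setting (an informal sketch only; the noise model, normalization, and precise statements in the paper differ). Model the noise as mixing the n×n Gaussian matrix X with an independent copy:

\[
X \;\longmapsto\; X_\epsilon \;=\; \sqrt{1-\epsilon}\,X \;+\; \sqrt{\epsilon}\,Y, \qquad Y \text{ an independent Gaussian matrix.}
\]

Since the permanent is a homogeneous polynomial of degree n in the matrix entries, a direct computation gives

\[
\mathrm{corr}\bigl(\mathrm{Per}(X),\,\mathrm{Per}(X_\epsilon)\bigr) \;=\; (1-\epsilon)^{\,n/2},
\]

which tends to zero exactly when n\epsilon \to \infty, that is, when \epsilon = \omega(1/n); for constant \epsilon only the low-degree part of the relevant expansions survives, and that, roughly speaking, is the part that bounded-depth computation can capture.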

Contagious error sources would need time travel to prevent quantum computation

On the positive side, Greg Kuperberg and I wrote a paper, Contagious error sources would need time travel to prevent quantum computation, showing that for a large class of correlated noise, (teleportation-based) quantum fault-tolerance works! Greg and I have had a decade-long email discussion (over 2000 emails) regarding quantum computers; this work grew out of our 2009 discussion (about my “smoothed Lindblad evolution” model) and relies heavily on ideas of Manny Knill.

Nadav Katz: Quantum information science – the state of the art

Some years ago, two brilliant experimentalists, Hagai Eisenberg and Nadav Katz,  joined  the already strong, mainly theoretical, quantum information group here at HUJI.  Nadav Katz gave the second lecture in the workshop, and here are the slides of Nadav’s  lecture: Quantum information science – the state of the art.


Experimental progress toward stable encoded qubits

Also very much on the positive side, Nadav mentioned remarkable recent progress by the Martinis group, showing certain encoded states based on 9 physical qubits that are an order of magnitude (a factor of 8.4, to be precise) more stable than the “raw” qubits used to create them!

Here is a link to the paper:  State preservation by repetitive error detection in a superconducting quantum circuit, by J. Kelly, R. Barends, A. G. Fowler, A. Megrant, E. Jeffrey, T. C. White, D. Sank, J. Y. Mutus, B. Campbell, Yu Chen, Z. Chen, B. Chiaro, A. Dunsworth, I.-C. Hoi, C. Neill, P. J. J. O’Malley, C. Quintana, P. Roushan, A. Vainsencher, J. Wenner, A. N. Cleland, and John M. Martinis.
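To make concrete what a repetition code of this kind buys (and what it does not), here is a toy Monte Carlo sketch. Everything in it (the flip probability, the number of rounds, the final majority-vote decoding) is an illustrative assumption and not a model of the actual superconducting experiment, which repeatedly measures parities of neighboring qubits. The point is only that redundancy plus a classical decoding rule suppresses bit-flip errors while doing nothing about phase errors, which is exactly why such encoded states protect particular states rather than arbitrary superpositions (see the milestone discussion below).

```python
import random

# Toy Monte Carlo sketch (illustrative only): a classical bit-flip
# repetition code decoded by a final majority vote, compared with a
# single unprotected bit. p and the number of rounds are assumed values.

def raw_failure(p, rounds, trials=100_000):
    """Fraction of trials in which a single unprotected bit ends up flipped."""
    fails = 0
    for _ in range(trials):
        bit = 0
        for _ in range(rounds):
            if random.random() < p:
                bit ^= 1
        fails += bit
    return fails / trials

def repetition_failure(p, d, rounds, trials=100_000):
    """Fraction of trials in which majority vote over d copies is wrong."""
    fails = 0
    for _ in range(trials):
        bits = [0] * d
        for _ in range(rounds):
            bits = [b ^ (random.random() < p) for b in bits]
        fails += sum(bits) > d // 2  # majority of the d copies flipped
    return fails / trials

if __name__ == "__main__":
    p, rounds = 0.01, 10  # illustrative numbers only
    print("single raw bit:       ", raw_failure(p, rounds))
    print("d = 9 repetition code:", repetition_failure(p, 9, rounds))
```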

Update:  Further comments on a Shtetl-optimized post (especially a comment by Graeme Smith) help to place the new achievement of the Martinis group within the seven milestones toward quantum computers from a 2012 Science paper by Schoelkopf and Devoret, which originated in David DiVincenzo’s 2000 paper “The physical implementation of quantum computation“. (You can also see these milestones here.)

The new achievement of having a very robust realization of certain encoded states can be seen as reaching milestone 3.5. The difference between milestone 3.5 and milestone 4 plays a central role in the seventh post of my 2012 debate with Aram Harrow, in connection with a conjecture I made in the first post (“Conjecture 1”). Aram made the point that classical error-correction can lead to very stable encoded qubits in certain states (which is essentially milestone 3.5). I gave a formal description of the conjecture, which essentially asserts that milestone 4, namely encoded qubits that allow arbitrary superpositions, cannot be reached. As I have said many times (see, for example, the discussion in my Simons Institute videotaped lecture 2), a convincing demonstration of milestone 4, namely an implementation of quantum error-correction with encoded qubits which are substantially more stable than the raw qubits (and which allow arbitrary superpositions of the encoded qubit), will disprove my conjectures. Such stable encoded qubits are expected from implementations of the distance-5 surface code. So we are 0.5 milestones away 🙂

I will be impressed to see even a realization of a distance-3 (or distance-5) surface code that gives good-quality encoded qubits, even if the encoded qubits have a quality somewhat worse than that of the raw qubits used for the encoding. These experiments, including those that have already been carried out, also give various other opportunities to test my conjectures.
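For orientation, the back-of-the-envelope scaling usually quoted for the surface code (a heuristic with model-dependent constants, not a statement about any specific experiment) is

\[
p_{\mathrm{L}} \;\approx\; A\left(\frac{p}{p_{\mathrm{th}}}\right)^{\lfloor (d+1)/2\rfloor},
\]

where p is the physical error rate per operation, p_{\mathrm{th}} is the threshold of the code and noise model (often quoted at around one percent for circuit-level noise), d is the code distance, and A is a modest constant. On this heuristic, distance-3 and distance-5 codes operating well below threshold are precisely where encoded qubits should start to beat the raw qubits, which is why I see these experiments as a natural place to test my conjectures.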

Peter Shor’s challenge #1 and my predictions from the failure of quantum computation

My lecture on predictions from the failure of QC is based on two lengthy recent comments (first, second) regarding predictions from the failure of quantum computers. In April 2014, Peter Shor challenged me with the following:

So if “quantum computers cannot work” is the premise of your theory, does your theory lead to any concrete predictions besides “quantum computers cannot work”? If it doesn’t, then I can’t help thinking that there really isn’t much substance to it.

While the discussion continued, it was not until November that I gave a detailed reply: a list of around 18 concrete predictions, along with other connections, possible applications, and further, vaguer predictions and speculations. (The comments have a much wider scope than the lecture.)

Peter Shor’s challenge #2:  Is particle physics immune to my hypothesis?

Following my longish prediction comments, Peter challenged me with yet another great question.

We already know that the standard formulation of quantum physics appears to be computationally difficult (I am talking about the formulation without smoothed Lindblad evolution).

What you are saying is that introduced noise will make the system behave differently from its current predictions and make it easier to simulate.

Right now, particle physics experiments are simulated using lattice QFT; this simulation is computationally very intensive, and simulating particle physics is believed to have high computational complexity. If we extrapolate your prediction, you predict that particle physics experiments will actually behave differently from the standard predictions, and in a way that is easier to simulate.

Is this correct, or is particle physics for some reason immune to your hypothesis?

The very short answer is that particle physics is not immune to my hypothesis!

This is an extremely interesting topic that a few of us here at HUJI have been thinking about together over the last year or so. What are the best examples of “quantum supremacy” in the context of many-body quantum systems (be it from condensed matter physics, chemistry, nuclear or atomic physics, or high energy physics), and how can we test and challenge (even on a heuristic level) the quantum-supremacy claim? (This is a difficult conceptual issue, and we also looked briefly at connections with chaotic behavior in classical systems.) We discussed quite a few potential candidates, like energy levels and other features of heavy atoms, highly accurate atomic clocks, and more, and right now the best candidate we have is QED computations for energy levels of the hydrogen atom.

My renewed conversation with Peter (and also with Aram, Alexander Vlasov, and Klas) has also led to this TCS StackExchange question: Quantum algorithms for QED computations related to the fine structure constants, and to two questions on the sister physics site: QCD and QED with unlimited computational power – how precise are they going to be? and The fine structure constant – can it genuinely be a random variable?

I am not sure which is stronger: the fact that, in quantum physics, computations are lagging behind experiments, sometimes by several orders of magnitude (an argument in favour of QC as a realistic possibility), or the (likely) fact that QC will make it possible to perform quantum physics computations hundreds of orders of magnitude more accurately than experiments (an argument against QC as a realistic possibility).

Michel and I, and Scott

Miron’s initial plan was to invite Michel Dyakonov, a prominent theoretical physicist with strong skeptical views regarding QC. While we are both skeptics regarding QC, Michel’s views are different from mine; see the pictures at the top. (Here is a link to a presentation of Michel’s from 2009 explaining his thoughts on the matter.) Michel, who visited HUJI last winter, does not expect the failure of QC to lead to important physics predictions; rather, he prefers to draw some strong sociological insights. This winter, Scott Aaronson, who also has strong views on the matter of QC, was in town, and it was fun to spend some time with him and chat face-to-face.

I remember a lengthy discussion on “polymath4”, with many contributions from both Noam Nisan and me, and then how much better it was to actually meet Noam in person in the cafeteria one day. (This gap is even larger when it comes to debates and disagreements.) When I mentioned this to Terry Tao, he told me that in Internet discussions he can actually “visualize” the other participants, so it is almost like talking with them in person.

Updates: 1) A related Shtetl-optimized post describes the Martinis group’s experimental breakthrough as well as a theoretical one by Ed Farhi, Jeffrey Goldstone, and Sam Gutmann about a quantum algorithm for approximating an NP-hard optimization problem.

2) Michel’s picture at the top of this post was designed as an illustration for Michel’s 2001 presentation “Quantum computing: a view from the enemy camp”. The “enemy camp” is presented on the right side of the picture. The actual drawing was done by Michel’s son, 16 years old at the time. (I remember that when I read the paper I felt, or imagined, some “anti-mathematics” sentiment. When I asked Michel about it last year, he told me that his son is actually a mathematician!)


In the pictures you can also see Miron, Nadav, Guy, and Scott, and the stormy sea at the old Tel Aviv harbor, as photographed by Guy.

 

A videotaped lecture on BosonSampling

Update: A video of the debate


7 Responses to Quantum computing: achievable reality or unrealistic dream

  1. vznvzn says:

    a very important topic. unfortunately there is some tendency to take the idea that qm computers may be impractical as an attack on the foundations of quantum physics which is not necessarily the case. noise in QM computing is turning out to be a very deep/ complex topic that will likely require a lot more interaction between theory and experiment.
    its great to see you cite dyakonov, it appears you did not eventually invite him. his writing is nearly polemical but some of the most lively scientific writing i have had the pleasure to read in years. have cited him myself a few times. its dismaying but one cant really prove anything he says is wrong and he has excellent credentials, bkg, experience, and maybe the media should interview him 1/100th as much as massive advocates such as aaronson…..

    • Gil Kalai says:

      Well, my view is that the “impracticality” of QC is related to fundamental issues within QM. Michel’s view is different. (Michel was invited but could not make it this time.) I find Scott’s advocacy in the media for QC, TOC, and the scientific endeavor as a whole very effective and useful.

      • vznvzn says:

        if its related to “fundamental issues,” then what qm axiom are you pointing to? heisenberg uncertainty principle? it appears qm computing is fully supported by axioms of QM, but thats the problem, those axioms are a strong formal abstraction of physical reality… re SA, ofc agree his advocacy is effective/ useful, but also largely unchallenged at least in the media. there seem to be no major media presentations of a contrarian view on QM feasibility.

  2. Alexander Vlasov says:

    I could say that the point about skepticism is a bit tricky in QC, because it is difficult to imagine an argument that rejects QC with (say) 99% probability; and even in such a case we would have a 1% probability of success, and that may be enough for further investigation, because fast factoring is too serious a topic …

    • Gil Kalai says:

      Dear Alexander, the probability we should assign to success in building a QC is, in my opinion, much higher than 1%. The question represents uncharted scientific territory; there are many excellent people in physics, computer science, and mathematics who regard it as possible, and who even regard it as a solid prediction of QM. (Some even regard it as a scientifically settled matter which now depends mainly on engineering.)

      The fruits from a working QC are also indeed tremendous: you mentioned factoring, but the ability to simulate quantum systems of arbitrary nature will probably be at least as important. If quantum supremacy is a real phenomenon and QCs are possible, this makes it much more likely that they can be useful or even necessary for describing and understanding natural quantum systems and for designing new quantum systems. As I said in the lecture, even if you give QC a probability of epsilon, the expected fruits from it are much above 1/epsilon, maybe 1/epsilon^2 🙂 .

      Regardless of whether QC can be built or not, the QC endeavor is a new way to “frame” and think about many areas of quantum physics. It has already led to important developments in many areas of theoretical and experimental physics, the theory of computing, and mathematics.

      But the good thing about it is that we will have plenty of opportunities to update our subjective probability. Both from the theoretical angle and, mainly, from the experimental side, we can expect new evidence. For me, the question of whether quantum fault tolerance can lead to sufficiently better encoded qubits (including superpositions; for the full Bloch sphere) is a crucial one. Similarly, how difficult it will be to scale up BosonSampling is crucial, and so is the experimental quest for convincing demonstrations of certain “new phases of matter” (those which convincingly do not represent small bounded-depth quantum computation).

      But of course, understanding the failure of quantum computers and “quantum supremacy” can also be a fruitful scientific journey. My personal opinion is that this side of the story will prevail.

  3. Pingback: 20 years in quantum computing | Are You Shura?

  4. Pingback: Ada the Amplifier | Gödel's Lost Letter and P=NP
