Boaz Barak: The different forms of quantum computing skepticism

Boaz Barak wrote a nice essay, which I reblog below, about skepticism of quantum computers. His essay is centered around a description of four scenarios (or possible “worlds,” in the style of Russell Impagliazzo) about the matter. Boaz’s worlds are called Superiorita (quantum computers are realistic and computationally superior), Popscitopia (quantum computers are realistic and can solve NP-complete problems), Skepticland (quantum computers are not realistic), and Classicatopia (quantum algorithms can be efficiently simulated on classical computers).

Apropos Russell’s universes, Boaz told me about a new cryptography “world” called Obfustopia that can be found (along with some modifications of the original worlds) in Boaz’s survey paper. I heard about it, and about the related important LWE (learning with errors) problem, also from Alon Rosen.

Boaz’s essay also includes his own optimistic personal view and a very brief critique of my skeptical stance. Boaz concludes his personal opinion with:

The bottom line is that, as far as I can tell, Superiorita is the most beautiful and evidence-supported world that is currently on offer.

Boaz’ main argument for his point of view is:

…as far as I can tell, these engineering difficulties are not fundamental barriers and with sufficient hard work and resources the noise can be driven down to as close to zero as needed.

This is indeed the crux of the matter, and my analysis gives good reasons to think that Boaz is not correct. The different opinions are described in the pictures below, and the crux is: can we cross the green line?

Common expectations

It is a common belief that by putting more effort into creating qubits, the noise level can be pushed down as close to zero as we want. Once the noise level is small enough and crosses the green line, quantum error correction allows logical qubits to reduce the noise even further with a small amount of additional effort. Very high quality topological qubits are also expected.

The picture suggested by my analysis

My analysis (see here and here and here) gives good reasons to expect that we will not be able to reach the green line, and that all attempts to build logical and topological qubits will yield bad-quality qubits.

Finer Worlds

One can make a finer division of the superiorita world, and, in my view, the three most relevant worlds are the following:

Quantopia – The model of quantum circuits is the correct description of local quantum systems in nature, and therefore universal quantum computation is possible. Quantum supremacy and robust quantum information are present, and perhaps even ubiquitous, in the physical world. Quantum systems, quantum information, and quantum computation are analogous to their classical counterparts. Quantum error correction is analogous to classical error correction: an important engineering tool, but not the thing that makes the qualitative difference as to whether scalable communication or computation is possible. (This represents Boaz’s beliefs expressed in his post and his later comments.)

Superiorita – (Quantum noise below the threshold.) Quantum systems are inherently noisy. Time-dependent quantum evolutions necessarily interact with the environment and are therefore noisy. The model of noisy quantum circuits is the correct description of local quantum systems in nature. Quantum error-correction shows that small islands representing noiseless quantum circuits can be created, and may also exist in nature. Hence quantum computation is possible. (This is perhaps the most common view among researchers in quantum information.)

Skeptica – (Quantum noise above the threshold.) Quantum systems are inherently noisy. Time-dependent quantum evolutions necessarily interact with the environment. The model of noisy quantum circuits is the correct description of local quantum systems in nature. The noise level cannot be pushed below the level allowing quantum error-correction and quantum fault-tolerance. Hence quantum computation is not possible. Quantum supremacy is not demonstrated in nature and cannot be demonstrated in the laboratory. (This is where I stand.)

There are, of course, people who are skeptical about quantum computers for other reasons, like the young Boaz, who simply did not like physics. There are also people who, for various reasons, are skeptical of quantum mechanics.

Quantopia suggests that “quantum supremacy” and “robust quantum information” will be present, and in fact ubiquitous, in the physical world, while under superiorita quantum supremacy represents rare islands inside a large “decoherence desert” (as Daniel Gottesman referred to it in his beautiful picture portrayed in this post). The difference between quantopia and superiorita is relevant to Scott Aaronson’s hope that quantum computers promise a trillion-dollar industry via simulations. While additional computing power is always welcome, this idea is less promising in the superiorita scenario, where the usefulness of quantum computers for simulation is less clear.

Anyway, without further ado, here is Boaz’s piece.

Windows On Theory

(see also pdf version)

Quantum computing is one of the most exciting developments of computer science in the last decades. But this concept is not without its critics, often known as “quantum computing skeptics” or “skeptics” for short. The debate on quantum computing can sometimes confuse the physical and mathematical aspects of this question, and so in this essay I try to clarify those. Following Impagliazzo’s classic essay, I will give names to scenarios or “potential worlds” in which certain physical or mathematical conditions apply.

Potential worlds

Superiorita is the world where it is feasible to build scalable quantum computers, and these computers have exponential advantage over classical computers. That is, in superiorita there is no fundamental physical roadblock to building large quantum computers, and hence the class BQP is a good model of computation that is physically realizable. More precisely, in superiorita the amount of resources (think…

View original post 2,547 more words

Posted in Quantum | Tagged | 5 Comments

Bálint Virág: Random matrices for Russ

You can now watch all the videos for Russfest: Elegance in Probability. Many great talks! I just watched Bálint Virág‘s lecture “Random matrices for Russ”. Highly recommended.

Posted in Combinatorics, Probability | Tagged , , | Leave a comment

Test Your Intuition 33: The Great Free Will Poll

Free will is defined (following Wikipedia) as the ability of humans to choose between different possible courses of action unimpeded. But you may take your favorite definition of free will. Philosophers (and others) have debated the definition of “free will” and the question of whether humans have free will for over two thousand years.

Answer at your free will

Posted in Philosophy | Tagged | 14 Comments

Must-read book by Avi Wigderson

The joyous lion of theoretical computer science

(Read here a story about Avi and me related to the above picture.)

Boaz Barak is reporting on Avi Wigderson’s book Mathematics and Computation which is on Avi’s homepage. I read through an earlier version and it is highly recommended! Let me also recommend Avi’s earlier survey on interaction between CS and math.

Windows On Theory

Avi Wigderson is one of the most prolific and creative theoretical computer scientists (in fact, he is one of the most prolific and creative scientists, period). Over the last several years, Avi has worked hard at distilling his vast knowledge of theoretical computer science and neighboring fields into a book surveying TCS, and in particular computational complexity, and its connections with mathematics and other areas.

I’m happy to announce that he’s just put a draft of this upcoming book on his webpage.

The book contains a high level overview of TCS, starting with the basics of complexity theory, and moving to areas such as circuit complexity, proof complexity, distributed computing, online algorithms, learning, and many more. Along the way there are interludes about the connections of TCS to many mathematical areas.

The book is highly recommended for anyone, but in particular for undergraduate and beginning graduate students that are interested…

View original post 140 more words

Posted in Combinatorics, Computer Science and Optimization, Updates | Tagged | Leave a comment

High Dimensional Combinatorics at the IIAS – Program Starts this Week; My course on Helly-type theorems; A workshop in Sde Boker

The academic year starts today. As usual it is very hectic, and it is wonderful to see the ever younger and younger students. Being a Tel-Avivian in residence in the last few years, I plan this year to split my time between HUJI and TAU.

Let me tell you about a special year taking place at the Israel Institute for Advanced Studies in Jerusalem on Geometric, Topological and Computational Aspects of High-Dimensional Combinatorics, the opening workshop of this program at Sde Boker, and my course on Helly-type theorems.

Continue reading

Posted in Combinatorics, Computer Science and Optimization, Geometry, Updates | Tagged , , | 4 Comments

Stan Wagon, TYI 32: Ladies and Gentlemen: The Answer

TYI 32, kindly offered by Stan Wagon, asked:

A round cake has icing on the top but not the bottom. Cut out a piece in the usual shape (a sector of a circle with vertex at the center), remove it, turn it upside down, and replace it in the cake to restore roundness. Do the same with the next piece. If the central angle of the pieces is 181°, what is the number of steps needed so that, under repeated cutting-and-flipping, all icing returns to the top of the cake?

Here are your answers:

And the correct answer is:

Continue reading

Posted in Combinatorics, Geometry, Test your intuition | Tagged , | 8 Comments

Ladies and Gentlemen, Stan Wagon: TYI 32 – A Cake Problem.

The following post was kindly contributed by Stan Wagon. Stan (Wikipedia) is famous for his books, papers, snow sculptures, and square-wheel bicycles (see picture below)!

A round cake has icing on the top but not the bottom. Cut out a piece in the usual shape (a sector of a circle with vertex at the center), remove it, turn it upside down, and replace it in the cake to restore roundness. Do the same with the next piece; i.e., take a piece with the same vertex angle, but rotated counterclockwise from the first one so that one boundary edge coincides with a boundary edge of the first piece. Remove it, turn it upside down, and replace it. Keep doing this in a counterclockwise direction. The figure shows the situation after two steps when the central angle is 90°. If θ is the central angle of the pieces, let f(θ) be the number of steps needed so that, under the repeated cutting-and-flipping just described, all icing returns to the top of the cake, with f(θ) set to ∞ if this never happens. For example, f(90°) = 8.
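The procedure can be simulated directly. Below is a minimal sketch (my own illustration, not from Stan Wagon's post) for integer angles: the cake is modeled as 360 one-degree cells, each recording whether the icing there currently faces up; one step takes the sector where the previous piece ended, reverses its cells (the piece is turned over), and toggles their bits.

```python
def f(theta):
    """Number of cut-and-flip steps until all icing faces up again,
    for an integer central angle theta (in degrees)."""
    cake = [True] * 360          # True = icing up; initially up everywhere
    start, steps = 0, 0
    while True:
        idx = [(start + i) % 360 for i in range(theta)]
        flipped = [not cake[i] for i in idx]       # turning the piece over toggles icing
        for i, v in zip(idx, reversed(flipped)):   # and reverses the order of its cells
            cake[i] = v
        start = (start + theta) % 360              # next piece starts where this one ended
        steps += 1
        if all(cake):
            return steps
```

With this model `f(90)` returns 8, matching the example above, and running `f(181)` answers the test-your-intuition question (no spoiler here).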

Test your intuition: What is f(181°)?

Source info:
Continue reading

Posted in Combinatorics, Guest blogger, Uncategorized | Tagged , , | 7 Comments

If Quantum Computers are not Possible Why are Classical Computers Possible?

As most of my readers know, I regard quantum computing as unrealistic. You can read more about it in my Notices AMS paper and its extended version (see also this post) and in the discussion of Puzzle 4 from my recent puzzles paper (see also this post). The amazing progress and huge investment in quantum computing (which I present and update routinely in this post) will put my analysis to the test in the next few years.

Guy Kindler and I identified a very primitive complexity class, LDP, that describes noisy quantum systems at small scales (a few bosons, qubits, etc.).

Today I would like to discuss, with the help of illustrations, a central issue in the debate about quantum computing: in view of a putative impossibility (and obvious difficulty) of quantum computing, why is classical computing possible (and so ubiquitous)? This was a major challenge that Aram Harrow raised in our 2012 debate (in this post), and it goes back to Dave Bacon and others. My 2014 work with Guy Kindler on noisy BosonSampling (taking it as an analogy for noisy quantum circuits) leads to a fairly detailed answer that I will explain below, mainly with illustrations.

The picture behind the question

A common view: in principle, there is no difference between achieving quantum computing via quantum error-correcting codes and achieving classical computing via classical error-correcting codes. 

Why are classical information and computation possible?

Encoding by repetition, and decoding by “majority” are both supported by low degree polynomials

Encoding by repetition refers to a logical ‘one’ being encoded by many physical ones, and similarly for ‘zero’. An approximate version of majority enables decoding. Both encoding and decoding are supported by low-degree polynomials. (A more realistic variant is that ‘one’ is encoded by, say, 70% ones and ‘zero’ by 30% ones.)
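As a toy illustration of this repetition/majority mechanism (a sketch of the general idea only, not of any physical implementation; the parameter values are arbitrary):

```python
import random

def encode(bit, n=101, bias=0.7):
    """Encode a logical bit by n physical bits.  With bias=1.0 this is the
    idealized repetition code (a 'one' is n ones); with bias=0.7 it is the
    more realistic variant where a 'one' is ~70% ones and a 'zero' ~30% ones."""
    p = bias if bit == 1 else 1 - bias
    return [1 if random.random() < p else 0 for _ in range(n)]

def noisy(physical, flip_prob=0.05):
    """Independent bit-flip noise on each physical bit."""
    return [b ^ (random.random() < flip_prob) for b in physical]

def decode(physical):
    """Decode by (approximate) majority vote."""
    return 1 if 2 * sum(physical) > len(physical) else 0
```

A typical use is `decode(noisy(encode(1)))`, which recovers 1 with overwhelming probability; the point in the text is that this decoding step needs only a very primitive computational device.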

Why are robust quantum information and quantum computation impossible?

It is commonly expected that creating good-quality quantum error correcting codes is harder than demonstrating quantum supremacy.

Unlike the repetition/majority mechanism which is supported by very primitive computational power, creating a quantum error correcting code and the easier task of demonstrating quantum supremacy are not likely to be achieved by devices which are very low-level in terms of computational complexity.

This is the difference!

(Quantum supremacy refers to the ability of quantum computers to perform computations which are infeasible for classical computers.)

The computational complexity picture

Bounded-depth computation (AC^0) is an extremely primitive computational complexity class. Sampling tasks that can be performed approximately by low-degree polynomials represent an even more primitive computational task. (The proof that LDP is in AC^0 is quite clever, but seems to Guy and me quite irrelevant to quantum computing 🙂 )

LDP is so low-level that it allows efficient learning!

(This leads to the following prediction: distributions coming from pure quantum states that can be approached by noisy quantum devices allow efficient approximate learning.)

The underlying principles

Very simple, isn’t it?

The notion of “primitive computational devices” in both principles applies to the asymptotic behavior. The second principle extends to quantum devices that do not involve quantum error-correction.

Can you be a little more formal?

Let me try. (Trying to understand how insights from the theory of computation relate to real life computation is quite important.)

I don’t know how general this insight is. (I asked about it on TCSoverflow.) Note that this insight gave the rationale for the threshold theorem to start with.

My older stuff about correlated errors

My earlier work (Section 6 of the Notices paper, and earlier works cited there) proposes other goals that appear to be easier than creating quantum error-correcting codes, and I expect them to be impossible.

The remarkable IBM machine applies two-qubit CNOT gates. One can expect that errors for the gated qubits have a large positive correlation. This is not controversial, but it would be nice to check it.

You can implement a CNOT gate indirectly by a chain of other gates. I expect that it will not be possible to reduce the positive correlation by a process that reaches the CNOT indirectly. (You can experience the IBM quantum computer here.)
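For illustration, one standard identity of this kind (my example, not necessarily a chain used on the IBM machine) realizes CNOT as a controlled-Z conjugated by Hadamard gates on the target qubit; it is easy to check numerically:

```python
import math

def matmul(A, B):
    """Plain matrix product of nested-list matrices."""
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def kron(A, B):
    """Kronecker (tensor) product: (A ⊗ B)[i][j] = A[i//p][j//q] * B[i%p][j%q]."""
    p, q = len(B), len(B[0])
    return [[A[i // p][j // q] * B[i % p][j % q]
             for j in range(len(A[0]) * q)] for i in range(len(A) * p)]

s = 1 / math.sqrt(2)
I2 = [[1, 0], [0, 1]]
H = [[s, s], [s, -s]]                                # Hadamard gate
CZ = [[1, 0, 0, 0], [0, 1, 0, 0],
      [0, 0, 1, 0], [0, 0, 0, -1]]                   # controlled-Z
CNOT = [[1, 0, 0, 0], [0, 1, 0, 0],
        [0, 0, 0, 1], [0, 0, 1, 0]]                  # controlled-NOT

# CNOT = (I ⊗ H) · CZ · (I ⊗ H): Hadamards on the target qubit turn CZ into CNOT.
U = matmul(matmul(kron(I2, H), CZ), kron(I2, H))
```

The resulting `U` agrees with `CNOT` entrywise (up to floating-point error), since H·Z·H = X on the target's lower block.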

A question by Ady Stern

It is a positive and not terribly small constant! I am not sure what “fundamental” means.

(Sometimes, I get corrected when I pronounce my name correctly.)

The quality of my argument


But maybe the processes for quantum error-correction have more than meets the eye?

Why not just improve the physical qubits?

Q: But why can’t you simply create good enough qubits to allow universal quantum circuits with 50 qubits?

A: This will allow very primitive devices (in terms of the asymptotic behavior of computational complexity) to perform superior computation.

Q: This is an indirect explanation. Can you give an explanation on the level of a single qubit?

A: Yes, if the qubit is of too high quality, the microscopic process leading to it also demonstrates a primitive device leading to a superior computation.

Topological quantum computing?

What about topological quantum computing?

See this videotaped lecture.

Is Nature a malicious opponent of quantum computing?

My model of errors is precisely the ordinary model.

What do you mean by “primitive computational devices”?

The fate of Quantum Mechanics

What about the huge efforts and resources for building quantum computers or intermediate tasks?

Also the Commonwealth Bank

I talked about the problem with Dave Bacon in 2006

(Inspiration: math with bad drawing)

Posted in Computer Science and Optimization, Physics, Quantum | Tagged , | 4 Comments

Sergiu Hart: Two-Vote or not to Vote

 

Sergiu Hart raises a very interesting idea regarding elections. Consider the Brexit referendum. Sergiu proposes to have two rounds, two weeks apart. Every voter can vote in each, and the votes of both rounds add up! The outcomes of the first round are made public well before the second round.

In the paper Repeat Voting: Two-Vote May Lead More People To Vote, Sergiu Hart suggests the following method:

A. Voting is carried out in two rounds.

B. Every eligible voter is entitled (and encouraged) to vote in each one of the two rounds.

C. All the votes of the two rounds are added up, and the final election result is obtained by applying the current election rules to these two-round totals.

D. The results of the first round are officially counted and published; the second round takes place, say, two weeks after the first round, but no less than one week after the official publication of the first round’s results.
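Steps A–D can be sketched in a few lines. In this sketch simple plurality stands in for “the current election rules,” and the ballot counts are made up for illustration (they are not actual Brexit figures):

```python
from collections import Counter

def repeat_voting_result(round1, round2):
    """Add up the ballots of both rounds (step C) and apply the election
    rule -- here, simple plurality for illustration -- to the totals."""
    totals = Counter(round1) + Counter(round2)
    winner = totals.most_common(1)[0][0]
    return winner, totals

# Hypothetical ballots: round 2 happens after round 1's results are published.
round1 = ["Leave"] * 52 + ["Remain"] * 48
round2 = ["Leave"] * 45 + ["Remain"] * 55
winner, totals = repeat_voting_result(round1, round2)
```

In this made-up example the two-round totals are Leave 97 vs. Remain 103, so the outcome can differ from round 1 alone, which is exactly the point of publishing the first-round results before the second round.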

(The pun in the title is taken from an early version by Sergiu.)

What do you think?

Posted in Economics, Games | Tagged , | 13 Comments

A toast to Alistair: Two Minutes on Two Great Professional Surprises

Alistair and the Simons Institute’s friendly and helpful staff

Luca Trevisan invited me to give a 3-minute (videotaped or live) toast for Alistair Sinclair, to celebrate that Alistair much deservedly received the SIGACT service award and to mourn that he also just retired from his role as associate director of the Simons Institute. These toasts will be part of the FOCS 2017 Saturday evening reception (October 14), which will be hosted by the SI and turn into a party for Alistair. It is great that leading scientists excel also in the all-so-important academic administration tasks. I can think in this context of Michael Rabin, who was the rector (provost) of my university, and of many others.

But I thought it could be a good idea to mainly mention in my toast two of Alistair’s great and mind-boggling scientific contributions. Miraculously, I had a videotaped segment which had been lying on the editing floor for three years. It was part of the first ever (and maybe the only ever) Simons Institute videotaped production. (The toast video was done by me using my smartphone, and it took many takes.) So you can hear a minute and a half of quick stories about rapid mixing, approximate permanents, inspiring persistence, and volume.

From the editing floor: Two Minutes on approximating permanents, rapidly mixing random walks in algorithms, and volumes

Click here for the toast video for Alistair!

This brief video was edited out from my videotaped lecture on quantum computers (see this earlier post). When we prepared the videos, I was quite excited by the fact that we did not need to shoot the video in the right order. (I even use this fact to outline a major difference between classical computation and quantum computation.) We first shot the pictorial + entertaining ending of Video II. Then we moved on to shoot Video I, and the plan was to start it with Alistair Sinclair introducing me. In the beginning of the unedited video, I say to Tselil in Hebrew that we do not need to bring Alistair up to the production room since he can record his introduction later. (In the end we decided not to have an introduction at all.) As seen from my smile, when I thanked Alistair for his yet-to-be-given introduction I felt as sophisticated as Niccolò Machiavelli.

And here are links to the papers mentioned: Approximating the Permanent by Jerrum and Sinclair, A polynomial-time approximation algorithm for the permanent of a matrix with non-negative entries by Jerrum, Sinclair and Vigoda (and here is the Journal version), and A random polynomial time algorithm for approximating the volume of convex bodies by Dyer, Frieze and Kannan.

Indeed these amazing algorithmic applications of rapidly mixing random walks came as great surprises and are extremely important!

Lovely ending presentation, forgotten pictures of Avi and Oded, Jesus, the camel and Bob Simons, and Scott Aaronson’s $100,000 promise.

Talking about my old Simons Institute lecture videos, let me first recommend to you the two-minute ending of the second video – lovely pictures and a wonderful song in the background. Can you identify the people there? (There are 49 altogether!)


The second video includes (toward the end; click here to jump to this part) a presentation of some pictures + funny things related (more or less) to the debate on quantum computers. It contains (click to jump) a forgotten picture of Avi Wigderson and Oded Goldreich, Continue reading

Posted in Computer Science and Optimization, Updates | Tagged | Leave a comment