thanks

**Gil Kalai** contemplates “E.g., for n=10 and m=20 we consider the 200-dimensional algebraic variety (of decomposable symmetric tensors parametrized by 10 by 20 complex matrices) inside a 20,000,000 dimensional Hilbert space (symmetric tensors) […] it is possible that small-degree polynomials [...] allows already good approximation for realistic noisy quantum systems.”

Gil, that is an illuminating observation!

Indeed, the generic experience of simulationists is that the low-dimensional varietal submanifolds immerse in Hilbert space as **a space-filling “foam”** that supports the flow of noisy/Lindblad trajectories — and thus their PTIME simulation as Carmichael unravelings — while rigorously respecting the various conservation laws, thermodynamic laws, and gauge invariances that are the “crown jewels” of 21st-century quantum dynamics.
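As a concrete toy instance of such a trajectory unraveling (assuming the simplest possible case: a single qubit under pure dephasing with jump operator L = √g·σ_z, a choice made here purely for illustration), the Monte Carlo average over unraveled trajectories reproduces the exact Lindblad decay of coherence:

```python
import numpy as np

# Toy quantum-trajectory ("Carmichael") unraveling of the pure-dephasing
# Lindblad equation drho/dt = g*(Z rho Z - rho). Jumps arrive as a Poisson
# process of rate g, and each jump flips the sign of the qubit coherence,
# so the trajectory-averaged coherence E[(-1)^N] should reproduce the
# exact Lindblad answer exp(-2*g*t).
rng = np.random.default_rng(0)
g, t, n_traj = 1.0, 0.5, 200_000

jumps = rng.poisson(g * t, size=n_traj)     # jump count per trajectory
coherence_mc = np.mean((-1.0) ** jumps)     # unraveled Monte Carlo estimate
coherence_exact = np.exp(-2 * g * t)        # closed-form Lindblad solution

print(coherence_mc, coherence_exact)        # agree to ~1e-2
```

This is of course the most trivial unraveling imaginable; the point is only that the trajectory average converges to the Lindblad flow while each individual trajectory stays on a low-dimensional (here, pure-state) manifold.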

Quantum research threads that are concretely “flowing” in this dynamical/geometric direction include (for example) Hamiltonian lattice gauge theories; see Buyens, Haegeman, Van Acoleyen, Verschelde, and Verstraete, “Matrix Product States for Gauge Field Theories” (arXiv:1312.6654) and Bañuls, Cichy, Cirac, Jansen, and Saito, “Matrix Product States for Lattice Field Theories” (arXiv:1310.4118).
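The appeal of these tensor-network methods is easy to quantify: a matrix product state of bond dimension χ describes an n-qubit state with O(n·χ²) parameters rather than 2ⁿ. A minimal numpy sketch (assumptions: open boundary conditions and random, unoptimized tensors, chosen only to exhibit the parameter count):

```python
import numpy as np

# Minimal matrix-product-state (MPS) sketch: one tensor of shape
# (chi_left, 2, chi_right) per qubit. Contracting the chain reproduces a
# full 2**n state vector, but the MPS itself stores only O(n * chi**2)
# parameters.
rng = np.random.default_rng(1)
n, chi = 12, 4

dims = [1] + [chi] * (n - 1) + [1]      # bond dimensions along the chain
tensors = [rng.normal(size=(dims[i], 2, dims[i + 1])) for i in range(n)]

state = tensors[0].reshape(2, dims[1])  # absorb the trivial left bond
for A in tensors[1:]:
    state = np.tensordot(state, A, axes=([-1], [0]))
state = state.reshape(-1)
state /= np.linalg.norm(state)

n_mps_params = sum(A.size for A in tensors)
print(state.size, n_mps_params)         # 4096 amplitudes vs 336 MPS parameters
```

The cited papers of course do far more than count parameters — they build gauge invariance into the tensors themselves — but the counting already shows why the varietal submanifold, not the ambient Hilbert space, sets the simulation cost.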

Can these algebraic/geometric/varietal simulation methods help to accelerate the experimental observation of nonabelian anyonic field-dynamics? Can these methods similarly be adapted to help optimize BosonSampling *mises en pratique*? Can these methods contribute to the rising tide of everyday technologies that press against quantum limits? Can the researches of any young person, equipped with a laptop computer, an internet connection, and an inquiring mind, be lifted by this *marée montante*?

The evident answers to these questions are (as it seems to me and many, and by the evidence of the recent literature) yes, yes, yes, and yes.

I have two additional remarks. First, Peter, it is not the case that my predictions (conjectures) come into play only when somebody tries to build a quantum computer. In fact, I make an effort to offer predictions that can be tested well before we reach the stage of quantum computers, e.g., predictions for two (physical) qubits and for a single (encoded) qubit, as well as predictions that apply to much more general quantum systems without even the tensor-product structure. (There is an issue that I do not specify the constants, and that some of the predictions are formulated in an asymptotic manner. I expect that the constants will be such that the predictions come into play for quite small systems.) For anyons my prediction is simply that the mass gap allowing quantum computation cannot be approached. In the related attempts to create very stable qubits based on the surface code, I have mentioned various times that I expect my conjectures to come into play in a substantial way already for the proposed distance-3 codes involving 20 qubits or so. (This may be examined even via detailed simulations.) My prediction is that we cannot create, up to small errors, encoded qubits (where “qubit” refers to an encoded state with an arbitrary prescribed superposition).

As a matter of fact, some of my conjectures — in particular, the issue of correlated errors for entangled qubits (which I am sure people are interested in anyhow) — can be tested already for the breakthrough experiments with just five qubits by Martinis et al. reported in this paper (again, both at the real experimental level and at the detailed simulation level). What is needed is to understand the probabilities of simultaneous failures for sets of *k* qubits, *k*=2,3,4,5. Of course, this is much more easily said than done, and we need to formalize more precisely what we want to test, but maybe it can be done already with the existing experimental apparatus. (I would expect a strong positive correlation, which will lead to a high probability of “synchronized errors.”)
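The statistic proposed here can be illustrated with a toy Monte Carlo (all parameters hypothetical: a 5-qubit device, per-qubit error probability p, and a crude correlation model in which part of the error budget is a shared “burst” hitting every qubit at once):

```python
import numpy as np

# Toy Monte Carlo contrasting independent qubit errors with positively
# correlated ("synchronized") errors on a hypothetical 5-qubit device.
# We estimate P(a fixed set of k qubits all fail in the same run), k=2..5.
rng = np.random.default_rng(2)
n_qubits, p, n_shots = 5, 0.1, 500_000

# Independent model: each qubit fails on its own with probability p.
indep = rng.random((n_shots, n_qubits)) < p

# Correlated model (approximately the same per-qubit marginal): half of
# the error probability comes from a global burst hitting all qubits.
burst = rng.random(n_shots) < p / 2
local = rng.random((n_shots, n_qubits)) < p / 2
corr = burst[:, None] | local

for k in range(2, n_qubits + 1):
    p_indep = indep[:, :k].all(axis=1).mean()   # falls off like p**k
    p_corr = corr[:, :k].all(axis=1).mean()     # stays near p/2 for all k
    print(k, p_indep, p_corr)
```

The qualitative signature is exactly the one described above: under independence the simultaneous-failure probability collapses geometrically with k, while under synchronized errors it plateaus, so even modest statistics on k=4,5 coincidences discriminate the two models.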

Second, John, in connection with the varietal dynamics that you mentioned several times, and approximations based on low-dimensional algebraic varieties in a high-dimensional Hilbert space, let me note that BosonSampling by itself is a process to reach bosonic states which belong to a small algebraic variety in a high-dimensional Hilbert space. Namely, we consider the variety of decomposable degree-*n* symmetric tensors in *m* variables (of dimension *nm* or so) inside the Hilbert space of all degree-*n* symmetric tensors in *m* variables, of dimension (*n*+*m*−1 choose *n*). E.g., for *n*=10 and *m*=20 we consider the 200-dimensional algebraic variety (of decomposable symmetric tensors parametrized by 10 by 20 complex matrices) inside a 20,000,000-dimensional Hilbert space (symmetric tensors). (But for 3 bosons the dimension of the variety is only roughly half that of the Hilbert space.) The ability of our experimental process to “hit” this low-dimensional variety closely is already a reason for concern. The mere fact that we have a small-dimensional variety does not imply that polynomial-time approximations are possible. But it is possible that, in every such situation, small-degree polynomials in the tangent space to the variety already allow good approximation for realistic noisy quantum systems which are approximately supported in such a variety (or approach it).
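The dimension counts in this paragraph are easy to check directly: decomposable degree-n symmetric tensors in m variables are parametrized by an n×m complex matrix, while the ambient space of degree-n symmetric tensors has the dimension of the space of degree-n monomials in m variables:

```python
from math import comb

# Dimension counts for the BosonSampling variety: decomposable degree-n
# symmetric tensors in m variables are parametrized by an n-by-m complex
# matrix (n*m parameters), while the ambient Hilbert space of all degree-n
# symmetric tensors in m variables has dimension comb(n + m - 1, n).
def variety_dim(n, m):
    return n * m

def hilbert_dim(n, m):
    return comb(n + m - 1, n)

print(variety_dim(10, 20), hilbert_dim(10, 20))  # 200 vs 20030010
```

For n=10, m=20 this gives 200 versus 20,030,010 — the “20,000,000-dimensional Hilbert space” of the text.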

Here is a picture with Nikhil from the Simons Institute

**Gil asks Peter** “I am not sure what precisely you refer to as ‘anyonic materials.’”

This request for clarification — asked of a first-rank physicist by a first-rank mathematician — calls to mind Vladimir Arnol’d’s lament “Every mathematician knows that it is impossible to understand any elementary course in thermodynamics.”

Similarly, from a mathematician’s point of view — and equivalently, from a computational simulationist’s point of view — what is “impossible to understand” about anyonic dynamics is *NOT* the topologically protected dynamical phases that are associated to braided trajectories of anyons, but rather the mechanisms by which coarse-grained anyonic dynamics (by postulate) arises from fine-grained Maxwell-Dirac dynamics, and is (postulated to be) robust with respect to the renormalization dynamics associated to edges, sensors, defects, and vacua.

Survey articles like Elliott Lieb’s “Quantum Mechanics, the Stability of Matter and Quantum Electrodynamics” (2004, arXiv:math-ph/0401004) can assist mathematicians toward an appreciation that

“Relativistic kinematics plus quantum mechanics is a ‘critical’ theory (in the mathematical sense). This fact will plague any relativistic theory of electrons and the electromagnetic field – primitive or sophisticated … it does not seem possible to keep to the current view that the Hilbert space is a simple tensor product of a space for the electrons and a Fock space for the photons … Despite the assertion that quantum mechanics and quantum field theory are gauge invariant, it seems to be essential to use this [the Coulomb] gauge, even though its relativistic covariance is not as transparent as that of the Lorentz gauge. … The Coulomb gauge, which puts in the electrostatics correctly, by hand, so to speak, and allows us to minimize the total energy with respect to the [gauge] field, is the gauge that gives us the correct physics and is consistent with the ‘quintessential quantum mechanical notion of a ground state energy’.”

These considerations remind us that the restriction of quantum dynamics to QED flows can be problematical for mathematicians and experimentalists alike.

In particular, QED dynamics compels us to accept renormalization effects and long-range large-gauge interactions — manifest in flux tubes and image charges, for example — that can be severely problematic for quantum computation (as the experience of recent years has taught us). On the other hand, these same QED effects act generically to ease the task of PTIME simulation. Evidently the QED-restricted, gauge-invariant, long-range quantum dynamics that Nature imposes upon us has very special properties that can either hurt us or help us, depending upon how we exploit them.

The wonderful articles of the *Royal Society’s* recent theme issue “The New SI Based on Fundamental Constants” (*Phil. Trans. R. Soc. A*, October 28, 2011; 369 (1953)) can be read, with a mathematical eye, as the physics embodiment of André Weil’s paean (written to his sister, the philosopher Simone Weil)

We [mathematicians] would be badly blocked if there were no bridge between the two [integers versus rational functions]. And voilà, God carries the day against the devil: this bridge exists; it is the theory of algebraic function fields over a finite field of constants.

Similarly, the *Royal Society’s* various “New SI” articles show us that in regard to *abelian* anyonic dynamics

“Voilà, God has carried the day against the devil; the bridge between quantum information theory and experimental practice exists; it is the (seeming!) invariant reproducibility, unbounded precision, and robust practicality of the quantum metrology triangles that are founded upon abelian anyon dynamics.”

A great open question is: Can God similarly carry the day in regard to *nonabelian* anyon dynamics, against a devil who restricts human technologies (and human physiologies) to QED dynamical flows? For example, are topologically protected QED-restricted nonabelian anyonic quantum memories feasible, both in principle and in practice?

No one knows … and our present understanding is too feeble (as it seems to me) to foresee the answer with any great confidence.

One reasonable step toward resolving these questions — a step that has not yet been taken — is to mathematically naturalize the experimentally robust *mises en pratique* of present-day QED-restricted abelian metrology triangles.

**Conclusion** There’s plenty of scope for Peter Shor and Gil Kalai to work *together* in advancing our present far-from-complete and far-from-natural understanding of anyonic dynamics, both abelian and nonabelian.

Needless to say, as a medical researcher I have a vested interest in an ever-more-natural mathematical understanding of quantum dynamical systems, ever-more-efficient simulation of them, and ever-more-robust *mises en pratique* for observing-and-controlling such systems. This is because we can confidently anticipate that the 21st century’s transformational advances in quantum information theory and quantum computing will be accompanied by transformational advances in medical technology and healing. So let’s keep advancing!

In the language of Serre and Grothendieck, the QIST Roadmap can be appreciated as *le dessein du marteau et du burin* (“the programme of the hammer and chisel”), whereas RKC/RKP varietal dynamics can be appreciated as *le dessein de la marée montante* (“the programme of the rising varietal tide”).

Needless to say, in quantum information theory as in algebraic geometry, it’s reasonable to pursue *both* approaches, as history shows us plainly, and common sense affirms, that each strengthens the other.

**Knuth’s Criterion** (1996) Science is what we understand well enough to explain to a computer. Art is everything else we do.

**Shor’s Postulate** (see above) “The impossibility of fault-tolerant quantum computation must mean that anyonic materials behave very differently from what quantum mechanics predicts.”

It is instructive and fun (and perhaps even practically useful) to consider *restrictions* of the above two statements, as follows:

**Restricted Knuth Criterion (RKC)** *RKC science* is (by definition) what we understand well enough to computationally simulate with PTIME resources. Art is everything else we do.

**Restricted Kalai Postulate (RKP)** Quantum metrology triangles and anyonic dynamical systems are both RKC science.

The point is that it may be neither computationally necessary, nor computationally feasible, nor even computationally desirable to embrace the goals that some crusaders advocate:

[We want to] “lift the enormity of Hilbert space out of the textbooks, and rub its full, linear, unmodified truth in the face of anyone who denies it.”

Rather, it may be preferable — in regard to advancing mathematics, science, engineering (and even medicine) — to instead progressively raise the tide of RKC science until it submerges the most lustrous crown jewels of quantum physics, that is: (1) abelian quantum metrology triangles of invariant reproducibility and unbounded precision, followed by (2) *nonabelian* metrology triangles, *also* of invariant reproducibility and unbounded precision.

Relative to the 2003-4 QIST Roadmap — which regrettably has been neither peer-assessed nor updated — the RKC/RKP *dessein de la marée montante* is (as it seems to me) more mathematically natural, more experimentally verifiable, of greater engineering consequence, more stimulating of enterprise, more feasible in scope, more incremental in execution, more readily initiated, more objectively assessable, more respectful of urgent strategic and humanitarian requirements, more compatible with international agreements of long standing, and (most important of all) more inspirational to young researchers.

Needless to say, in this regard it is neither necessary, nor feasible, nor desirable that everyone think alike. Indeed the RKC/RKP *dessein de la marée montante* is already well underway, and is immersing quantum physics in a rising global appreciation that the Kalai Postulates, for at least the next few decades, may pragmatically be regarded as being *effectively* correct.

Thanks for the good comment. Your comment consists of a paragraph on your view and continues with a few sentences on my thinking. Let me respond in two stages. First let me rephrase what you wrote for the bosonic case (which I have studied in more detail), and then I will move back to anyons.

**The bosonic analog**

(A’) It seems to me that bosons lead easily enough to BosonSampling, which demonstrates “quantum supremacy,” so the impossibility of (computationally superior) BosonSampling must mean that bosonic materials behave very differently from what quantum mechanics predicts.

Yes, realistic (noisy) bosonic states behave very differently from what quantum mechanics predicts for noiseless bosonic states.

(B’) However, your thinking seems to be that bosonic materials might work exactly the same as quantum mechanics predicts they will, except for the fact that, if anybody actually manages to control the bosons well enough to demonstrate (superior) quantum computation, the computation will fail for mysterious reasons.

I will rephrase the “my thinking” part in a stronger and simpler way as follows:

(C’) Nobody (nature included) can control bosonic states “well enough” to demonstrate superior quantum computation.

For the bosonic case (more precisely, for states based on random Gaussian matrices) I am working out the situation with Guy Kindler, and our work suggests the following more concrete predictions.

(P’1) **[noise-sensitivity]** Even for a small amount of noise in the creation process of the bosonic state, there will be a very large difference (in norm) between the behavior of the ideal bosonic state and that of the noisy bosonic state.

(P’2) **[small support]** Realistic (noisy) bosonic states are supported on low-degree Hermite polynomials.

(P’2) means that for a fixed amount of noise the noisy versions have polynomial-time approximations.

**Back to anyons**

(A) “It seems to me that anyons lead to fault-tolerant quantum computation easily enough that the impossibility of fault-tolerant quantum computation must mean that anyonic materials behave very differently from what quantum mechanics predicts.”

Yes, as in the case of bosonic states, a small amount of noise in the process leading to “anyonic materials” will cause them to behave very differently from the ideal model. Both of these behaviors are consistent with “what quantum mechanics predicts.”

(B) However, your thinking seems to be that anyonic materials might work exactly the same as quantum mechanics predicts they will, except for the fact that, if anybody actually manages to control the anyons well enough to do quantum computation, the computation will fail for mysterious reasons.

My thinking is that

(C) **Nobody** (nature included) can “control” the anyons well enough to demonstrate quantum fault-tolerance and do quantum computing.

((C) is both stronger and clearer than (B).)

Of course, you may ask why I don’t have predictions for the anyonic case as concrete as those for the bosonic case. I can make a few excuses: the mathematics and physics are deeper and more complicated. The implementation is more mysterious — for example, I am not sure what precisely you refer to as “anyonic materials.” Is it a hypothetical theoretical gadget, Peter, or do you really refer to some materials? Do you regard the implementation of the surface code based on superconducting qubits as an “anyonic material”? And finally, I simply did not study the anyonic case in detail.

We can expect to have predictions which are similar to (P’1) and (P’2) in the bosonic case, namely:

(P1) **[Noise-sensitivity]** A small amount of noise in the creation process of anyonic states leads to a very large difference between the noisy states and the ideal states.

(P2) **[Low-degree support]** Realistic noisy anyonic states are supported by “low degree” representations and have polynomial-time approximations.

But unlike the bosonic case, I do not have results supporting these predictions, and frankly I don’t understand the representation theory well enough to state (P2) in precise mathematical terms, so this is left to be explored.

As I said earlier, the *simulation heuristic* (which I regard as strong, though not ironclad) applies both to anyonic and to bosonic computation. (The application to anyons is stronger and more direct.) The simulation heuristic gives a clear prediction that anyonic-based qubits are noisy.

Let me try to explain what I mean when I say your theory doesn’t make any predictions.

It seems to me that anyons lead to fault-tolerant quantum computation easily enough that the impossibility of fault-tolerant quantum computation must mean that anyonic materials behave very differently from what quantum mechanics predicts.

However, your thinking seems to be that anyonic materials might work exactly the same as quantum mechanics predicts they will, except for the fact that, if anybody actually manages to control the anyons well enough to do quantum computation, the computation will fail for mysterious reasons.

So the only thing that you are predicting from the assumption of no fault-tolerant quantum computation is that quantum computation won’t work.