Ringel Conjecture, Solved! Congratulations to Richard Montgomery, Alexey Pokrovskiy, and Benny Sudakov

Ringel’s conjecture solved (for sufficiently large n)

A couple of weeks ago, a few days after I heard an excellent lecture about it by Alexey Pokrovskiy in Oberwolfach, the paper A proof of Ringel’s Conjecture by Richard Montgomery, Alexey Pokrovskiy, and Benny Sudakov was posted on the arXiv. Congratulations to Richard, Alexey, and Benny!

Here is the abstract:

A typical decomposition question asks whether the edges of some graph G can be partitioned into disjoint copies of another graph H. One of the oldest and best known conjectures in this area, posed by Ringel in 1963, concerns the decomposition of complete graphs into edge-disjoint copies of a tree. It says that any tree with n edges packs 2n+1 times into the complete graph K_{2n+1}. In this paper, we prove this conjecture for large n.
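As a quick sanity check (my own sketch, not the method of the paper), a brute-force search finds such a decomposition in the smallest nontrivial case: the path with n=2 edges packed 5 times into K_5.

```python
from itertools import combinations

def path2_copies(n):
    """All copies in K_n of the tree with 2 edges: a center c and two leaves."""
    return [frozenset({frozenset({c, a}), frozenset({c, b})})
            for c in range(n)
            for a, b in combinations([v for v in range(n) if v != c], 2)]

def decompose(remaining, copies):
    """Backtracking: partition the edge set `remaining` into copies."""
    if not remaining:
        return []
    for cp in copies:
        if cp <= remaining:
            rest = decompose(remaining - cp, copies)
            if rest is not None:
                return [cp] + rest
    return None

n = 2                                           # the tree: a path with 2 edges
K = frozenset(frozenset(e) for e in combinations(range(2 * n + 1), 2))
packing = decompose(K, path2_copies(2 * n + 1))
print(len(packing))                             # 2n+1 = 5 edge-disjoint copies
```

Note that the edge count already matches exactly: a tree with n edges taken 2n+1 times gives n(2n+1) = \binom{2n+1}{2} edges, the number of edges of K_{2n+1}.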

Previous posts: Test your intuition 43 – What will be the distribution of anabraists and algasisits; MIP*=RE.

Absorption and distributive absorption

First the authors construct an approximate decomposition (they divide the problem into two cases), and then they apply a powerful method called absorption. Here is what they write:

To prove our finishing lemmas, we use an absorption strategy. Absorption has its origins in work by Erdős, Gyárfás and Pyber [5] and Krivelevich [14], but was codified by Rödl, Ruciński and Szemerédi [20] as a versatile technique for extending approximate results into exact ones. For both Case A and Case B we use a new implementation of distributive absorption, a method introduced by the first author in [16].

Absorption is one of the most powerful current methods in extremal combinatorics. It is important in the construction of designs (see here, here, here, and here, and my own survey), and in the solution of several other important conjectures.

Here are links to the pioneering papers mentioned above.

P. Erdős, A. Gyárfás and L. Pyber [5] Vertex coverings by monochromatic cycles and trees. Journal of Combinatorial Theory, Series B, 51: 90–95, 1991.

M. Krivelevich [14] Triangle factors in random graphs. Combinatorics, Probability and Computing, 6: 337–347, 1997.

V. Rödl, A. Ruciński and E. Szemerédi [20] A Dirac-type theorem for 3-uniform hypergraphs. Combin. Probab. Comput., 15:229–251, 2006.

And Montgomery’s paper on distributive absorption:

R. Montgomery [16] Spanning trees in random graphs, Advances in Mathematics, 2019

A related paper with an amusing name and an amusing abstract is:  E. Szemerédi “Is laziness paying off? (`Absorbing’ method).”  The abstract of the paper reads “We are going to mention some embedding problems. The main ideas of most of the solution of these problems are that do not work too much. Finish the job almost, then use some kind of absorbing configuration (that is what we call laziness).”

Montgomery, Pokrovskiy, and Sudakov’s paper concludes with two related famous conjectures:

The graceful tree conjecture

Motivated by Ringel’s conjecture, in 1967 Alexander Rosa conjectured that every tree T with n edges admits a graceful labeling, namely a labeling of the vertices by the numbers 0,1,…,n such that if we label an edge by the absolute value of the difference of the labels for its two vertices, then all the edge labels are different!

This famous conjecture (stronger than Ringel’s conjecture) is known as the Graceful Tree conjecture and also as the Ringel–Rosa–Kotzig conjecture. It is still open.
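The reason the graceful tree conjecture is stronger is Rosa’s classical observation: rotating a gracefully labeled tree around \mathbb{Z}_{2n+1} produces 2n+1 edge-disjoint copies that decompose K_{2n+1}. Here is a small sketch illustrating this for the path with 4 edges (the brute-force search for a graceful labeling is mine):

```python
from itertools import permutations

# Tree: the path 0-1-2-3-4 (n = 4 edges)
tree_edges = [(0, 1), (1, 2), (2, 3), (3, 4)]
n = len(tree_edges)

# Brute-force a graceful labeling: a bijection vertices -> {0,...,n}
# whose edge labels |l(u)-l(v)| are pairwise distinct.
labeling = next(
    dict(enumerate(p)) for p in permutations(range(n + 1))
    if len({abs(p[u] - p[v]) for u, v in tree_edges}) == n
)

# Rosa's observation: rotating the labeled tree around Z_{2n+1}
# gives 2n+1 edge-disjoint copies decomposing K_{2n+1}.
m = 2 * n + 1
copies = [
    {frozenset(((labeling[u] + k) % m, (labeling[v] + k) % m))
     for u, v in tree_edges}
    for k in range(m)
]
all_edges = set().union(*copies)
print(len(all_edges), m * (m - 1) // 2)   # 36 36: a perfect decomposition
```

The point is that each edge label d ∈ {1,…,n} appears once in the base copy, and its rotations sweep out all edges of circular difference d in K_{2n+1}.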

Packing trees of different sizes: Gyárfás conjecture

Gyárfás conjecture (1976): Let T_1, \dots ,T_n be trees with T_i having i vertices for each i=1,2,\dots,n. Then the edges of K_n can be decomposed into n trees which are isomorphic to T_1, \dots ,T_n respectively.


Posted in Combinatorics, Open problems, Updates

Test your intuition 43: Distribution According to Areas in Top Departments.


In the community of mamathetitians in a certain country there are mamathetitians in two areas: Anabra (a fraction p of the mamathetitians) and Algasis (a fraction 1-p of the mamathetitians). There are ten universities, with 50 faculty members in each mamathetics department, and there is a clear ranking of these universities from best to worst. (So a better university pays a higher salary and gives better working conditions.) Every year 40 fresh Ph.D.’s are on the job market, and each university hires two of them as new faculty members, replacing retired faculty members.

The objective value v(x) of every mamathetitian x is a random variable uniformly distributed in the interval [0,1]. The subjective value of x in the eyes of another mamathetitian y is v(x)+t if they belong to the same area and v(x)-t if they belong to different areas, where t is a small real number.

The top-ranked university hires the best two fresh Ph.D.’s according to their average value in the eyes of its faculty, the second university hires the next best two, etc.

Let’s assume that at the start p=0.25, so 25% of all faculty are anabraists and 75% are algasisits, and that this is the ratio in every university. Let us also assume that among the fresh Ph.D.’s, every year 25% are anabraists and 75% are algasisits.

Test your intuition (and/or programming skills) 43: What will be the distribution of anabraists and algasisits in the departments as time advances (depending on t)? What will be the stationary distribution across departments?
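For readers who want to try the programming route, here is a minimal simulation sketch of the model. The details left unspecified in the post (which two members retire each year, the exact parameter values) are my own assumptions:

```python
import random

def simulate(t=0.05, p=0.25, depts=10, size=50, n_phd=40, years=500, seed=0):
    """Each department is a list of areas (True = anabraist). Every year:
    each department (in rank order) retires two random members and hires
    the two remaining candidates maximizing the average subjective value,
    which works out to v + t*(2q-1) for an anabraist and v - t*(2q-1) for
    an algasist, where q is the department's current anabraist fraction."""
    rng = random.Random(seed)
    faculty = [[i < round(p * size) for i in range(size)] for _ in range(depts)]
    for _ in range(years):
        # the yearly pool of fresh Ph.D.'s: (objective value, is_anabraist)
        pool = [(rng.random(), rng.random() < p) for _ in range(n_phd)]
        for dept in faculty:                     # index 0 = top-ranked
            rng.shuffle(dept)
            del dept[:2]                         # two random retirements
            q = sum(dept) / len(dept)
            pool.sort(key=lambda c: c[0] + t * (2 * q - 1) * (1 if c[1] else -1),
                      reverse=True)
            dept.extend(c[1] for c in pool[:2])  # hire the top two
            del pool[:2]
    return [sum(d) for d in faculty]             # anabraists per department

print(simulate())   # anabraist counts, top-ranked department first
```

The averaging step hides no subtlety: averaging v(x)±t over the department’s 48 remaining members gives exactly the score used in the sort key.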

We can also ask about variations to the model:

Further test your intuition, imagination and programming skills: How does the answer change if you change the model in various ways? Change the value of p? Make the initial distribution random across departments? Add noise to the subjective value? Make the bias in subjective evaluation asymmetric? Make different assumptions on the fresh Ph.D.’s? Etc.

References to similar models considered in the literature are most welcome.

Disclaimer: I do not know the answers.

Outcomes by Joshua Paik

Here are very interesting outcomes by Joshua Paik. We see a sort of zebra-stripe pattern between Algasis-dominated departments and Anabra-dominated departments.

Posted in Combinatorics, Open problems, Probability, Test your intuition

Two talks at HUJI: on the “infamous lower tail” and TOMORROW on recent advances in combinatorics

In this post I advertise my colloquium lecture tomorrow – Thursday 23/1/2020, 14:30 – on recent advances in combinatorics, and also mention Wojtek Samotij’s lecture in our combinatorics seminar on The lower tail for triangles in random graphs. Click here for the previous post on MIP*=RE.

Artist: Heidi Buck. “Catch a Dragon by the Tail” ( source )

A few months ago we briefly described the solution to the infamous upper tail problem for random graphs. More recently, Gady Kozma and Wojciech Samotij solved the infamous lower tail problem.

My colloquium talk on Thursday January 23: Recent Advances in Combinatorics

Remark: I chose only a few advances, mainly from the last few months and mainly in probabilistic and extremal combinatorics. In the lecture I will briefly mention others as well.

Colloquium: Gil Kalai (HUJI) – Some recent advances in combinatorics
I will discuss some recent advances in combinatorics, among them the disproof of Hedetniemi’s conjecture by Shitov, the proof of the sensitivity conjecture by Huang, the progress on the Erdős–Rado sunflower conjecture by Alweiss, Lovett, Wu, and Zhang, and the progress on the expectation threshold conjecture by Frankston, Kahn, Narayanan, and Park.

Thu, 23/01/2020 – 14:30 to 15:30

Manchester Building (Hall 2), Hebrew University Jerusalem

Combinatorics seminar last Monday: Kozma and Samotij solved the infamous lower tail problem for random graphs.

Combinatorics Seminar HUJI

When: Monday Jan 20, 10:00-11:45
Where: C-400, CS building

Speaker: Wojciech Samotij (TAU)

Title: The lower tail for triangles in random graphs

Let X denote the number of triangles in the random graph G_{n,p}. The problem of determining the asymptotics of the logarithmic lower tail probability of X, that is, the function f_c(n,p) = - \log Pr(X \le c \mathbb E [X]), for c \in [0,1), has attracted considerable attention from both the combinatorics and the probability communities. In this talk, we explain how one can bound f_c(n,p) from above by solving a natural combinatorial optimisation problem that generalises Mantel’s/Turán’s theorem. Moreover, we will prove a matching lower bound on f_c(n,p) in the case p = 1/2. Our argument, which employs ideas from an earlier joint work with Gady Kozma, Tom Meyerovitch, and Ron Peled, exploits connections between entropy and independence.

This is joint work with Gady Kozma.

Posted in Combinatorics, Updates

Amazing: Zhengfeng Ji, Anand Natarajan, Thomas Vidick, John Wright, and Henry Yuen proved that MIP* = RE, and thus disproved Connes’ 1976 embedding conjecture and provided a negative answer to Tsirelson’s problem.

A few days ago a historic 160-page paper with a very short title, MIP*=RE, was uploaded to the arXiv by Zhengfeng Ji, Anand Natarajan, Thomas Vidick, John Wright, and Henry Yuen. I am thankful to Dorit Aharonov and Alon Rosen for telling me about it. The paper simultaneously settles several major open problems in quantum computational complexity, mathematical foundations of quantum physics, and mathematics. Congratulations to Zhengfeng, Anand, Thomas, John, and Henry!

A tweet by Ryan

The new paper dramatically improved the 2019 result by Anand Natarajan and John Wright asserting that “NEEXP is in MIP*”.

In this post I will talk a little about two corners of this work and neglect many others:  I will describe a few conjectures about infinite groups related to the new result, and I will give a very gentle beginning-of-an-introduction to interactive proofs.  I will also give some useful links to papers, presentations and blog posts.

Boris Tsirelson is one of my mathematical heroes. The new paper gives a negative answer to an important problem posed by Tsirelson. (Here is a great interview with Boris.)


My Qstate: a master project (Vidick’s memories of the road since Julia Kempe offered him the problem 14 years ago; with a lot of scientific content). Older related posts on My Qstate  I, II, III.

A Notices AMS article by Vidick: From Operator Algebras to Complexity Theory and Back.

Shtetl Optimized: MIP*=RE;  GLL (Ken Regan) Halting Is Poly-Time Quantum Provable ; Other posts on blogs: WIT (Boaz Barak); CC (Lance Fortnow); WN; Posts on an earlier 2019 result  MIP* contains NEEXP. Updates: A post on GLL on halting, two excellent posts in Hebrew I, II.

Quanta Magazine: An article by Kevin Hartnett about an earlier result MIP*=NEEXP; An article about the new result.

Older posts here: about Vidick’s 2012 paper (among various updates); a 2008 post mentioning sofic groups (among various updates).

Videotaped lectures from our recent winter school: Thomas Vidick on quantum protocols, Video 1, Video 2, Video 3.

A mathematical context (of one corner of the work) and wish list: stability theory for groups.

(I am thankful to Alex Lubotzky for telling me about the algebra background.)

Links: Finitary approximations of groups and their applications, by Andreas Thom, Andreas’ ICM 2018 videotaped lecture. And a great video: Best of Andreas Thom. See also this paper Stability, cohomology vanishing, and non-approximable groups by Marcus De Chiffre, Lev Glebsky, Alex Lubotzky, and Andreas Thom.

And (thanks Mikael de la Salle!) a recent book by Gilles Pisier:
Tensor products of C*-algebras and operator spaces; The Connes-Kirchberg problem

The assertion of Connes’ embedding conjecture, refuted in the MIP*=RE paper, would imply several (outrageous :)) stronger conjectures that are still open. One is the conjecture of Connes that every group is “hyperlinear.” Another famous conjecture (an affirmative answer to a question posed by Gromov) is that every group is sofic. As sofic groups are hyperlinear, we can now expect (even more than before) that non-sofic and even non-hyperlinear groups will be found. Here is a rough short explanation of what these conjectures are about. (Kirchberg’s conjecture is another group-theoretic question of this nature.)

Every finite group is a permutation group and is a linear group. This is not the case for infinite groups, and there are various interesting notions of being “approximately a permutation group” (this is “sofic”) and “approximately a linear group” (this is called “hyperlinear”).

Given a group Γ we want to find a sequence of functions f_n

  1. From Γ to symmetric groups S_n,
  2. or from Γ to the unitary groups U(n).

such that, asymptotically as n grows, these functions are “almost homomorphisms” with respect to certain metrics DIST on S_n or U(n) respectively. This means that for every two elements a, b \in \Gamma,

DIST( f_n(ab), f_n(a)f_n(b)) tends to zero as n tends to infinity.


  1. A sofic group refers to the normalized Hamming metric on the symmetric groups.
  2. A hyperlinear group refers to the metrics given by the normalized Hilbert–Schmidt norm on the unitary groups.
  3. MF-groups: again the unitary groups, but with the operator norm this time.

And various other metrics were also considered. The assertion of the famous embedding conjecture by Connes on von Neumann algebras (now refuted by the new result) implies that every group is hyperlinear.
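To make the definition concrete, here is a toy illustration (my own sketch) for the group of integers \mathbb{Z}: truncated shifts on {0,…,n-1}, patched into permutations in a deliberately non-homomorphic way, give maps f_n whose normalized Hamming distance DIST(f_n(a+b), f_n(a)f_n(b)) = (a+b)/n tends to zero.

```python
def shift(n, k):
    """Truncated shift on {0,...,n-1}: i -> i+k, with the k overflow
    points folded back in reverse so the map is a permutation (and
    deliberately NOT an exact homomorphism from Z)."""
    return [i + k if i + k < n else n - 1 - i for i in range(n)]

def compose(p, q):                      # (p∘q)(i) = p(q(i))
    return [p[q[i]] for i in range(len(p))]

def hamming(p, q):                      # normalized Hamming distance on S_n
    return sum(x != y for x, y in zip(p, q)) / len(p)

a, b = 3, 5
for n in (100, 1000, 10000):
    d = hamming(compose(shift(n, a), shift(n, b)), shift(n, a + b))
    print(n, d)                         # d = (a+b)/n -> 0: almost a homomorphism
```

The mismatches occur only on the a+b points near the boundary, which is exactly the Følner-set picture behind the soficity of amenable groups.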

A remaining wish list: find a non-sofic group; find a non-hyperlinear group; refute Kirchberg’s conjecture (if it has not already been refuted).

Interactive proofs and some computational complexity background.


Links: here are slides of a great talk by Yael Kalai: The evolution of proof in computer science; a blog post on this topic by Yael Kalai; and a post here about Yael’s 2018 ICM paper and lecture.

A decision problem is in P if there is a polynomial-time algorithm (in terms of the input size) to test whether the answer is yes or no. A decision problem is in NP if there is a proof for a YES answer that can be verified in polynomial time.

Here are two examples: the question whether a graph has a perfect matching is in P. The question whether a graph has a Hamiltonian cycle is in NP: if the answer is yes, a prover can give a proof that requires the verifier only a polynomial number of steps to verify.
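A small sketch of the second example: verifying a claimed Hamiltonian cycle takes time polynomial in the size of the graph, even though finding one may not.

```python
def verify_hamiltonian_cycle(n, edges, cycle):
    """Polynomial-time verifier: does `cycle` (an ordering of the
    vertices) witness a Hamiltonian cycle in the graph (n, edges)?"""
    edge_set = {frozenset(e) for e in edges}
    return (sorted(cycle) == list(range(n)) and
            all(frozenset((cycle[i], cycle[(i + 1) % n])) in edge_set
                for i in range(n)))

# C_5 plus a chord: the cycle itself is the certificate
edges = [(0, 1), (1, 2), (2, 3), (3, 4), (4, 0), (0, 2)]
print(verify_hamiltonian_cycle(5, edges, [0, 1, 2, 3, 4]))   # True
print(verify_hamiltonian_cycle(5, edges, [0, 2, 1, 3, 4]))   # False
```

The verifier only checks n edges and one sorting; the hard part, producing the certificate, is the prover’s job.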

IP is a complexity class based on a notion of interactive proof where, based on a protocol of questions and answers, the prover can convince the verifier (with arbitrarily high probability) that the answer is yes. Following a sequence of startling developments, Adi Shamir proved that IP equals the quite large complexity class PSPACE. When we consider several non-interacting provers (two provers suffice), the computational power, denoted by MIP, is even larger: László Babai, Lance Fortnow, and Carsten Lund proved that MIP=NEXP! NEXP is the class of decision problems where, if the answer is yes, a prover can give a proof that requires the verifier (at most) an exponential number of steps to verify.

Enter quantum computation and entanglement

We replace the model of classical computation with quantum computation. Each of the two provers, Prover1 and Prover2, has access to a separate set of m qubits, but they can prepare in advance a complicated quantum state on those 2m qubits. When we run the verification protocol each prover has access only to its m qubits and, as in the classical case, the two provers cannot communicate. These types of verification protocols represent the complexity class MIP*. In 2012 Tsuyoshi Ito and Thomas Vidick proved that MIP* contains NEXP. In this 2012 post I reported an unusual seminar we ran on the problem.

Interactive quantum lecture: We had an unusual quantum seminar presentation by Michael Ben-Or on the work A multi-prover interactive proof for NEXP sound against entangled provers by Tsuyoshi Ito and Thomas Vidick. Michael ran Vidick’s recent videotaped talk on the matter, and from time to time he and Dorit acted as a pair of provers with the rest of the audience as verifier. (Michael was one of the authors of the very first paper introducing multi-prover interactive proofs.)

Let me also mention a related 2014 paper by Yael Kalai, Ran Raz, and Ron Rothblum: How to delegate computations: the power of no-signaling proofs. They considered two provers that are limited by the “no-signaling principle” and showed that the power of interactive proofs is then exactly EXP. (Here is a videotaped lecture by Ran Raz.)

In April 2019, Anand Natarajan and John Wright uploaded a paper with a proof that MIP* contains NEEXP. (NEEXP is the class of decision problems where, if the answer is yes, a prover can give a proof that requires the verifier (at most) a doubly exponential number of steps to verify.)

Here is a nice quote from Hartnett’s Quanta article regarding the Natarajan–Wright breakthrough:

Some problems are too hard to solve in any reasonable amount of time. But their solutions are easy to check. Given that, computer scientists want to know: How complicated can a problem be while still having a solution that can be verified?

Turns out, the answer is: Almost unimaginably complicated.

In a paper released in April, two computer scientists dramatically increased the number of problems that fall into the hard-to-solve-but-easy-to-verify category. They describe a method that makes it possible to check answers to problems of almost incomprehensible complexity. “It seems insane,” said Thomas Vidick, a computer scientist at the California Institute of Technology who wasn’t involved in the new work.

Now, with the new result, I wonder if this bold philosophical interpretation is sensible: there is a shared quantum state that will allow two non-interacting provers (with unlimited computational power) to convince a mathematician whether a given mathematical statement has a proof, and also to convince a historian or a futurist about any question regarding the past or future evolution of the universe.

What is RE?

(I forgot to explain what RE is. Here is the description from the paper itself.)

RE is the class of recursively enumerable languages, i.e. languages L such that there exists a Turing machine M such that x ∈ L if and only if M halts and accepts on input x. Note that, in addition to containing all decidable languages, this class also contains undecidable problems such as the Halting problem, which is to decide whether a given Turing machine eventually halts.

A little more information and links

The negative answer to Tsirelson’s problem asserts, roughly, that there are types of correlations that can be produced by infinite quantum systems but cannot even be approximated by finite systems. Connes’ 1976 embedding conjecture (now refuted) from the theory of von Neumann algebras asserts that “every type II_1 von Neumann factor embeds into an ultrapower of the hyperfinite II_1 factor.”

The abstract of the new paper mentions a few other works that are important for the new proof.

Anand Natarajan and Thomas Vidick. Low-degree testing for quantum states, and a quantum entangled games PCP for QMA. (FOCS, 2018.)

Anand Natarajan and John Wright. NEEXP ⊆ MIP∗ (FOCS 2019) (We mentioned it above.)

Joe Fitzsimons, Zhengfeng Ji, Thomas Vidick, and Henry Yuen. Quantum proof systems for iterated exponential time, and beyond.

The abstract also mentions two papers about the connections with Tsirelson’s problem and the Connes embedding conjecture:

Tobias Fritz. Tsirelson’s problem and Kirchberg’s conjecture. Reviews in Mathematical Physics, 24(05):1250012, 2012. (A few enlightening comments by Fritz on SO: I, II)

Marius Junge, Miguel Navascues, Carlos Palazuelos, David Perez-Garcia, Volkher B Scholz, and Reinhard F Werner. Connes’ embedding problem and Tsirelson’s problem. Journal of Mathematical Physics, 52(1):012102, 2011.

Let me also mention

Narutaka Ozawa. About the Connes embedding conjecture. Japanese Journal of Mathematics, 8(1):147–183, 2013.

Posted in Algebra, Analysis, Combinatorics, Computer Science and Optimization, Physics, Quantum

Do Not Miss: Abel in Jerusalem, Sunday, January 12, 2020

From left: Christopher Hacon, Claire Voisin, Ulrike Tillmann,  François Labourie

Update: This was a great event with four great inspiring talks.

Abel in Jerusalem, January 12, 2020

The Einstein Institute of mathematics is happy to host the Abel in Jerusalem Conference

“Abel in Jerusalem” will be the 10th one-day conference with lectures aimed at a mathematically educated and interested audience, with the objective of increasing public awareness of mathematics and of the Abel Prize.


9:40-10:00   Opening Remarks by Hans Petter Graver, President of the Norwegian Academy of Science and Letters

10:00-11:00    Christopher Hacon (University of Utah): Geometry of complex algebraic varieties    Abstract

11:00-11:30    Coffee Break

11:30-12:30    Ulrike Tillmann (Oxford University): Manifolds via cobordisms: old and new    Abstract

12:30-14:30    Lunch Break

14:30-15:30    Claire Voisin (Collège de France): Diagonals in algebraic geometry    Abstract

15:30-16:00    Coffee Break

16:00-17:00    François Labourie (Université Côte d’Azur): Counting curves and building surfaces: some works of Maryam Mirzakhani    Abstract

17:00-17:15    Closing remarks

17:15 – Transport to a reception at the King David Hotel
17:30 – Reception
19:00 – End of program


Christopher Hacon: Geometry of complex algebraic varieties

Abstract: Algebraic varieties are geometric objects defined by polynomial equations. The minimal model program (MMP) is an ambitious program that aims to classify algebraic varieties. According to the MMP, there are 3 building blocks: Fano varieties, Calabi–Yau varieties and varieties of general type, which are higher-dimensional analogs of Riemann surfaces of genus 0, 1, and at least 2 respectively. In this talk I will recall the general features of the MMP and discuss recent advances in our understanding of Fano varieties and varieties of general type.

Ulrike Tillmann: Manifolds via cobordisms: old and new

Abstract: Manifolds are a fundamental mathematical structure of central importance to geometry. The notion of cobordism has played an important role in their classification since Thom’s work in the 1950s. In a different way, cobordisms are key to Atiyah’s axiomatic formulation of topological quantum field theory. We will explain how the two seemingly unrelated appearances of cobordisms have come together to give us a new approach to study the topology of manifolds and their diffeomorphisms. In addition to my own work, the talk will draw on results by Madsen, Weiss, Galatius and Randal-Williams.

Claire Voisin: Diagonals in algebraic geometry

Abstract: The diagonal of a manifold appears naturally in topology, for example in the Hopf formula. Furthermore, the Künneth decomposition for the class of the diagonal controls the torsion in integral cohomology. In the context of algebraic geometry, I will discuss a weaker notion of decomposition of the diagonal, which has important applications in the study of rationality of algebraic varieties.

François Labourie: Counting curves and building surfaces: some works of Maryam Mirzakhani

Abstract: I will use some works of Maryam Mirzakhani as a thread to explain very basic and elementary facts of geometry : how to build surfaces, how to count curves on surfaces, what is hyperbolic geometry. The talk will be elementary and most of it will be targeting undergraduate students.



Gil Kalai
Tamar Ziegler

Posted in Algebra, Conferences, Geometry

The Brown-Erdős-Sós 1973 Conjecture

Greetings from Oberwolfach. This week there is a great meeting here on combinatorics. In this post I want to state the Brown–Erdős–Sós conjecture and one of its variants. The trigger was a beautiful talk I heard from Lior Gishboliner on some impressive progress toward this very difficult problem. Yet the solution of the problem seems far, and some people even doubt whether the conjecture is true. Here is the paper “A New Bound for the Brown–Erdős–Sós Problem” by David Conlon, Lior Gishboliner, Yevgeny Levanzov, and Asaf Shapira.

The Brown-Erdős-Sós conjecture and strong versions

Lior’s talk reminded me that Peter Frankl and Vojta Rödl told me in the late 80s about strong variants of the Brown–Erdős–Sós problem that imply Szemerédi’s theorem. With the kind help of Vojta Rödl, who is here, I can tell you about two such variants. (But I am solely responsible for all mistakes.)

Let f(n,v,e) be the largest number of edges in a 3-uniform hypergraph with n vertices in which no v vertices span e edges or more.

Brown-Erdős-Sós Conjecture: f(n,e+3,e)=o(n^2), for every e\ge 3.

For e=3 the BES conjecture is the famous Ruzsa–Szemerédi theorem (which implies Roth’s theorem on 3-term arithmetic progressions). For e=4 the conjecture is painfully open (and is stronger than Szemerédi’s theorem for 4-term APs).
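For very small parameters, f(n,v,e) can be computed by exhaustive search; here is a sketch of mine (hopeless beyond tiny n, but useful for getting a feel for the definition):

```python
from itertools import combinations

def f(n, v, e):
    """Max edges of a 3-uniform hypergraph on n vertices with no set of
    v vertices spanning e or more edges (exhaustive search; tiny n only)."""
    triples = [frozenset(t) for t in combinations(range(n), 3)]
    def ok(H):
        return not any(sum(E <= set(S) for E in H) >= e
                       for S in combinations(range(n), v))
    # if a k-edge family works, so does any subfamily, so search upward
    k = 0
    while any(ok(H) for H in combinations(triples, k + 1)):
        k += 1
    return k

r1, r2 = f(6, 6, 3), f(7, 6, 3)
print(r1, r2)   # 2 3
```

For instance f(6,6,3)=2, since any three triples on six vertices violate the condition, while f(7,6,3)=3: three triples through a common vertex span all seven vertices, but no fourth triple can be added.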

Here is an even stronger conjecture that implies Szemerédi’s theorem for arithmetic progressions of arbitrary length.

Consider the following 3-uniform hypergraph H(v) with v+3 vertices and v edges: start with a path with v+1 vertices (and v edges). Color the edges alternately red and brown, and add a red vertex and a brown vertex. Now form a 3-uniform hypergraph whose edges are the red edges of the path joined with the red vertex and the brown edges of the path joined with the brown vertex.

Conjecture: Every H(v)-free 3-uniform hypergraph with n vertices has o(n^2) edges.
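The construction is easy to write down explicitly; here is a small sketch building H(v) and checking the vertex and edge counts (the apex names "r" and "b" are my arbitrary choice):

```python
def H(v):
    """The 3-uniform hypergraph H(v): a path with v+1 vertices 0..v whose
    edges alternate red/brown, joined with a red apex r or a brown apex b."""
    r, b = "r", "b"
    return [frozenset((i, i + 1, r if i % 2 == 0 else b)) for i in range(v)]

hyp = H(5)
vertices = set().union(*hyp)
print(len(vertices), len(hyp))   # v+3 = 8 vertices, v = 5 edges
```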

For another strengthening of the BES conjecture that implies Szemerédi’s theorem, see the paper A remark on Pisier-type theorems of Erdős, Nešetřil and Rödl. Famously, the celebrated hypergraph regularity lemma (and the associated counting lemma) implies another Turán-type theorem for hypergraphs that in turn implies Szemerédi’s theorem and its high-dimensional extensions.

The new bounds by Conlon, Gishboliner, Levanzov, and Shapira

Although the theorem is about 3-uniform hypergraphs, the proof relies on the hypergraph regularity lemma for higher-dimensional hypergraphs.

With Vojta in 1985 (or ’86) and today (2020)

Posted in Combinatorics

Tomorrow: Boolean functions day at the TAU theory fest

As part of the 2019/2020 TAU theory fest, tomorrow, Friday, January 3, 2020,  is a Boolean function day at Tel Aviv University. The five speakers are Esty Kelman, Noam Lifschitz, Renan Gross, Ohad Klein, and Naomi Kirshner.

For more (and probably not all) mathematical activities in Israel these days including Abel in Jerusalem see this post.

The Abel in Jerusalem site now has the titles and abstracts of all talks. The talks are aimed at a level that advanced undergraduate students can learn from and enjoy. (The conference celebrating Boris Solomyak’s birthday was just added to the post on current math activities in Israel.)

Windows on Theory reports from the theory fest on an exciting scientific bet between Elchanan Mossel and Subhash Khot. (Raising again the question whether scientific bets are good.) Also on Windows on Theory, a report by Dorit Aharonov on the winter school on mathematics of quantum computing at HUJI.

Posted in Combinatorics, Conferences, Convex polytopes

The Google Quantum Supremacy Demo and the Jerusalem HQCA debate.

Below are 10 annotated slides from a spontaneous informal talk that I gave at the school on mathematics of quantum computing a week ago (PowerPoint presentation). Later in the afternoon we had a panel/debate on quantum supremacy (click for the video) moderated by Sandy Irani and featuring Scott Aaronson, Dorit Aharonov, Boaz Barak, Sergio Boixo, Adam Bouland, Umesh Vazirani, and me. It was a thoughtful and interesting discussion. (The presentation was initially prepared for the afternoon debate, but in the end I gave it as a separate talk and presented just one slide at the panel itself. The plan was also to post it as background material before the discussion but, not untypically, I was too slow, so here it is a week later.) I am thankful to Dorit for inviting me to the panel (and for organizing a great school!), to Sandy for her excellent moderation, and to all the panelists for their good spirit. (Update: a blog post by Dorit at Windows on Theory)

My assessment continues to be that both the Google supremacy claims and their other central claim about fidelity estimation are incorrect.

(Dec 23, 2019: Following some renewed discussion of the terminology in the last few days, I proposed to replace the term “quantum supremacy” with “HQCA – Huge Quantum Computational Advantage”, and I may try to follow my new term in my own lectures/papers in the future and see how it goes. In fact, I already used HQCA in our Kazhdan seminar on Sunday, where I lectured about noise stability and sensitivity for Boolean functions, Boson Sampling, and quantum circuits (lecture notes).)

My Presentation on the Google Quantum Supremacy Demonstration

Note: you may click on a slide to see it in full screen.

My lecture is not about quantum computers and quantum supremacy in general but about the Google quantum supremacy claims. (If you want to know more about my general argument against quantum computers, quantum supremacy and quantum error correction go to the previous post Gil’s Collegial Quantum Supremacy Skepticism FAQ and to several of my papers and presentations linked there.)

In this presentation I will concentrate on aspects of the Google experiment that are “too good to be true”. (Which is a bad sign, not a good sign.)

There is an amazing agreement between the fidelity as estimated from the experimental data and a very simple high-school model based on the probabilities of individual qubits and gates malfunctioning. (An even more simplified version, below in purple, requires just three parameters: the number n of qubits, the number g1 of 1-qubit gates, and the number g2 of 2-qubit gates.)
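In this simplified form the estimate is just a product of the survival probabilities of all the gates and readouts. The sketch below uses error rates and gate counts that are illustrative stand-ins of the same order of magnitude as the reported figures, not the actual calibration data:

```python
def predicted_fidelity(n, g1, g2, e1, e2, er):
    """The simple 'Formula (77)'-style estimate: multiply the success
    probabilities of every 1-qubit gate, 2-qubit gate, and readout.
    All numbers fed in here are illustrative stand-ins."""
    return (1 - e1) ** g1 * (1 - e2) ** g2 * (1 - er) ** n

# a 53-qubit supremacy-sized circuit (rough stand-in gate counts and rates)
F = predicted_fidelity(n=53, g1=1113, g2=430, e1=0.0016, e2=0.0062, er=0.038)
print(F)   # a fraction of a percent
```

The point of the slide is not the formula, which is unsurprising, but how uncannily well such a crude product matched the experimental estimates.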

Formula (77) itself is not surprising, but the terrific success of Formula (77) may serve as a smoking gun for the claim that the Google experiment has serious methodological problems.

I was not the only one to be amazed. Here is what John Martinis, the scientific leader of the Google supremacy experiment, said about it (Nov. 1, 2019). Read carefully!

The accuracy of Formula (77) is the most amazing thing about the data, something that came as a big surprise, and that (jokingly) will let quantum scientists keep their jobs.

And here is what I think

No, no: you cannot estimate, with a precision of 10–20%, the probability of failure of a physical system with 1000 interacting elements as the product of 1000 error probabilities.

This remarkable agreement is a major new discovery, and it is not needed for building quantum computers. It is only needed for the extrapolation argument leading the Google team to supremacy.

Even if quantum computers are eventually built, this is something that we are not going to witness. It would be interesting to check whether we see anything remotely like this for the IBM quantum computers.

Here on the left you see, for various values of n (the number of qubits), two experiments with different circuits that lead to entirely different probability distributions. Since each pair has a similar number of gates, Formula (77) leads to very close fidelity estimates, which are so accurate that the green and blue lines coincide! We do not expect to see such close agreement in experimental data.


Blue, orange and green analytic smooth curves representing three distributions. For all three you also see empirical samples from the distributions. For two of them the samples are perfect; for the other one you see a sample based on a complicated physical experiment.

Test your intuition: can you tell the difference? Can you tell which is which?

(You can also test your intuition which was the single slide presented at the afternoon debate.)

Blind tests are standard in scientific experiments and are quite easy to implement here.

The meaning of the surprising statistical success of Formula (77) should be carefully examined.

Two extra slides

To put my view in context, I devote one slide to my general argument against quantum computers, quantum supremacy, and quantum error correction.

I mentioned that most experts do not agree with my argument, or do not even understand it, and for some of them this may also be a reason for skepticism about my critique of the Google experiment. I personally think that my general argument is good, but I don't think it is ironclad, and I am pleased to see this matter explored experimentally (properly). I also think that my critique of the Google demo stands well on its own.

And what may convince me to change my mind? Poetry!

Let me add that all my critique as described here and elsewhere was first presented to the Google team.

From right to left: Sandy Irani, me, Boaz Barak, Adam Bouland, Dorit Aharonov, and on the left Umesh Vazirani, Scott Aaronson, and Sergio Boixo. (Collage; More pictures – below.)

Continue reading

Posted in Controversies and debates, Quantum | Tagged , , , , , , , , , , | 2 Comments

Four Great Numberphile Graph Theory Videos

A quick link to our previous post: Gil Bor, Luis Hernández-Lamoneda, Valentín Jiménez-Desantiago, and Luis Montejano-Peimbert: On the isometric conjecture of Banach.


We have mentioned the Numberphile video channel before (here, and here, and here). It has amazing videos about mathematics for the very general public. And the general public is interested! The channel has more than 3 million subscribers, and a Numberphile video usually gets well over 100,000 views. While the videos are aimed at very general audiences, it is quite common that I learn new mathematics while watching them. Today I mention four recent Numberphile videos on graph theory.

Before that, let me mention that, while combinatorics has a special appeal for general audiences, it is a hard challenge to explain some very abstract areas of mathematics to a very large audience (or even to the large audience of Quanta Magazine). It is not impossible: here is Kevin Hartnett's article in Quanta Magazine on ∞-categories and Jacob Lurie's work, and here is a 2016 article by Erica Klarreich on perfectoid spaces and Peter Scholze's work. I hope to see Numberphile moving into these areas as well.

A breakthrough in Graph theory:  Yaroslav Shitov’s counterexample to Hedetniemi’s conjecture as told by Erica Klarreich.

For more on this development see this post. The video contains nice pictures of Stephen Hedetniemi and his 1966 thesis. An exciting open question: given two graphs of chromatic number n, is the chromatic number of their tensor product at least f(n), for some function f(n) that tends to infinity with n?
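To make the objects in this question concrete, here is a small brute-force sketch (my own illustration, not from any of the papers; the function names are hypothetical). It builds the tensor product of two graphs and computes the chromatic number of K3 × K3, where the answer indeed matches min(3, 3) = 3:

```python
from itertools import product as cartesian

def edges_of_complete(n):
    """Edge list of the complete graph K_n on vertices 0..n-1."""
    return [(u, v) for u in range(n) for v in range(n) if u < v]

def tensor_product(n_g, edges_g, n_h, edges_h):
    """Tensor (categorical) product: (u,v) ~ (u',v') iff u ~ u' and v ~ v'."""
    adj_g = set(edges_g) | {(v, u) for u, v in edges_g}
    adj_h = set(edges_h) | {(v, u) for u, v in edges_h}
    verts = [(u, v) for u in range(n_g) for v in range(n_h)]
    edges = [(a, b) for a in verts for b in verts
             if a < b and (a[0], b[0]) in adj_g and (a[1], b[1]) in adj_h]
    return verts, edges

def chromatic_number(verts, edges):
    """Smallest k admitting a proper k-coloring (exhaustive search)."""
    for k in range(1, len(verts) + 1):
        for coloring in cartesian(range(k), repeat=len(verts)):
            color = dict(zip(verts, coloring))
            if all(color[a] != color[b] for a, b in edges):
                return k

verts, edges = tensor_product(3, edges_of_complete(3), 3, edges_of_complete(3))
print(chromatic_number(verts, edges))  # 3 = min(chi(K3), chi(K3))
```

Hedetniemi's conjecture asserted that this equality min(χ(G), χ(H)) always holds; Shitov's counterexample shows it can fail, and the open question above asks how badly.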

Maria Chudnovsky – perfect graphs

Starting with the definition and ending with the strong perfect graph theorem (conjectured by Claude Berge and proved by Chudnovsky, Robertson, Seymour, and Thomas).

An exciting open question: Find a good definition for perfect r-uniform hypergraphs.

We discussed perfect graphs and related notions in this post, this one and this one.

(Remark: Numberphile misguidedly regarded this video as a continuation of the “planar graph” video. The truth of the matter is that these are separate topics, and the perfect graph video deserves to be a separate Numberphile video.)

Maria Chudnovsky – planar graphs

Planar graphs are deep and fascinating (and miraculous, and always surprising) mathematical objects.  In Hebrew we can say about the topic that it is “Olam Umelo’o” (עולם ומלואו) (=a whole world with all its content). There are many fascinating properties of planar graphs and many open problems about them.

There are also many fascinating generalizations. Let me mention Abby Thompson’s problem: Suppose that G is the graph of a simple d-polytope with n vertices. Suppose also that n is even (this is automatic if d is odd). Can we always properly color the edges of G with d colors? (For d=3 the answer is yes; by Tait’s theorem this is equivalent to the four color theorem.)
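As a tiny sanity check on Thompson's question (my own sketch, with hypothetical helper names), here is a backtracking search for a proper d-edge-coloring, run on the graph of the 3-cube, a simple 3-polytope with an even number of vertices:

```python
def edge_coloring(vertices, edges, k):
    """Backtracking search for a proper k-edge-coloring: edges sharing
    a vertex must receive distinct colors. Returns a color list or None."""
    incident = {v: [] for v in vertices}
    for i, (a, b) in enumerate(edges):
        incident[a].append(i)
        incident[b].append(i)
    colors = [None] * len(edges)

    def ok(i, c):
        a, b = edges[i]
        return all(colors[j] != c for v in (a, b) for j in incident[v] if j != i)

    def solve(i):
        if i == len(edges):
            return True
        for c in range(k):
            if ok(i, c):
                colors[i] = c
                if solve(i + 1):
                    return True
        colors[i] = None
        return False

    return list(colors) if solve(0) else None

# Graph of the 3-cube: vertices are 3-bit strings, edges flip one bit.
verts = list(range(8))
edges = [(u, u ^ (1 << b)) for u in verts for b in range(3) if u < u ^ (1 << b)]
print(edge_coloring(verts, edges, 3) is not None)  # True: 3 colors suffice
```

Of course the cube is trivially 3-edge-colorable (color each edge by the bit it flips); the interest of Thompson's problem is whether d colors always suffice for every simple d-polytope with an even number of vertices.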

(We discussed planar graphs here and on MO, many times.)

Discussing vertex covers and Alcuin numbers for graphs – and a famous old puzzle. Filmed at MSRI with Annie Raymond.

We discussed here and in other posts the important notion of vertex covers, but the mathematics of the video was completely new to me.

Posted in Combinatorics | Tagged , , , , , | Leave a comment

Gil Bor, Luis Hernández-Lamoneda, Valentín Jiménez-Desantiago, and Luis Montejano-Peimbert: On the isometric conjecture of Banach

Stefan Banach and one of his famous quotes. Is it really true? A commentator (troll?) named Gina tried to challenge it in a heated discussion over the n-Category cafe.

This post briefly describes a May 2019 paper On the isometric conjecture of Banach written by Gil Bor, Luis Hernández-Lamoneda, Valentín Jiménez-Desantiago, and Luis Montejano-Peimbert. (Thanks to Imre Barany who told me about it.)

Luis Montejano-Peimbert

The isometric conjecture of Banach

Stefan Banach asked the following question in 1932: Let V be a Banach space, real or complex, finite or infinite dimensional, all of whose n-dimensional subspaces, for some fixed integer n, 2 ≤ n < dim(V), are isometrically isomorphic to each other. Is it true that V is a Hilbert space?

The question has been answered affirmatively in the following cases.

(1) In 1935, Auerbach, Mazur and Ulam gave a positive answer in case V is a real 3-dimensional Banach space and n = 2.

(2) In 1959, A. Dvoretzky proved a theorem from which an affirmative answer follows for all real infinite-dimensional V and n ≥ 2.

(3) Dvoretzky’s theorem was extended in 1971 to the complex case by V. Milman.

(4) In 1967, M. Gromov gave an affirmative answer in case V is finite dimensional, real or complex, except when n is odd and dim(V) = n + 1 in the real case, or n is odd and n < dim(V) < 2n in the complex case.

The beautiful new theorem by Gil Bor, Luis Hernández-Lamoneda, Valentín Jiménez-Desantiago, and Luis Montejano-Peimbert reads:

Theorem: Let B\subset \mathbb R^{n+1} , n = 4k + 1, ~ k \ge 1, n \ne 133, be a convex symmetric body, all of whose sections by n-dimensional subspaces are linearly equivalent. Then B is an ellipsoid.

The reason for the strange exception n \ne 133 has to do with the fact that 133 is the dimension of the exceptional Lie group E_7.

Gil Bor with students

E7 (source: Wikipedia). The 126 vertices of the 2_31 polytope represent the root vectors of E7, as shown in this Coxeter plane projection.

Posted in Geometry | Tagged , , , , | Leave a comment