Ordinary computers can beat Google’s quantum computer after all

Science magazine has an article by Adrian Cho, Ordinary computers can beat Google’s quantum computer after all. It is about the remarkable progress in classical simulations of sampling tasks like those that led to Google’s 2019 announcement of “quantum supremacy”. I reported on the breakthrough paper of Feng Pan and Pan Zhang in this post, and there are updates in the post itself and in the comment section about a variety of related works by several groups of researchers.

The very short summary is that classical algorithms are by now ten orders of magnitude faster than those used in the Google paper, and hence the speed-up is ten orders of magnitude lower than Google’s fantastic claims. (The Google paper claims that their ultimate task, which required 300 seconds on the quantum computer, would require 10,000 years on a powerful supercomputer. With the new algorithms the task can be done in a matter of seconds.)
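The “ten orders of magnitude” figure can be sanity-checked with a little arithmetic. A minimal sketch (the 30-second runtime for the new classical algorithms is an illustrative assumption; the post only says “a matter of seconds”):

```python
import math

# Google's 2019 estimate for the best classical simulation of the task
google_estimate_s = 10_000 * 365.25 * 24 * 3600  # 10,000 years in seconds

# Assumed runtime for the new classical algorithms (illustrative;
# the post says only "a matter of seconds")
new_classical_s = 30

# Ratio of the old classical estimate to the new classical runtime
speedup = google_estimate_s / new_classical_s
print(f"speed-up over the old estimate: ~10^{math.log10(speedup):.0f}")
# prints: speed-up over the old estimate: ~10^10
```

With a runtime of tens of seconds rather than 10,000 years, the classical improvement is indeed about ten orders of magnitude.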

Also regarding the Google supremacy paper, my paper with Yosi Rinott and Tomer Shoham, Statistical Aspects of the Quantum Supremacy Demonstration, has just appeared in “Statistical Science”. (Click on the link for the journal version.) The Google 2019 paper and, more generally, NISQ experiments raise various interesting statistical issues. (In addition, it is important to double-check various statistical claims of the paper.) One of our findings is that there is a large gap between the empirical distribution and the Google noise model. I hope to devote a future post to our paper and to some further research we have been doing.

The leaking of the Google paper on September 23, 2019 led to huge media and academic attention and many very enthusiastic reactions. I also wrote a few critical posts here on the blog about the Google claims.

Here is a figure with the price of bitcoin around the time of the Google quantum supremacy (unintended) announcement.


Update: There is a recent post on Shtetl Optimized with Scott Aaronson’s take on the supremacy situation. Overall our assessments are not very far apart. I don’t understand the claim: “If the experimentalists care enough, they could easily regain the quantum lead, at least for a couple more years, by (say) repeating random circuit sampling with 72 qubits rather than 53-60, and hopefully circuit depth of 30-40 rather than just 20-25.” In my view the most crucial task is to try to repeat and to improve some aspects of the Google experiment even for 20-40 qubits. (In any case, nothing is going to be easy.)

This entry was posted in Quantum, Updates. Bookmark the permalink.

5 Responses to Ordinary computers can beat Google’s quantum computer after all

  1. - says:

    Could the Google quantum supremacy announcement have something to do with manipulating the Bitcoin price?

  2. Thanks for saving me the time to understand a paper that may not have the desired payoff.

  3. Lars says:

    This highlights the central element of pretty much all claims of quantum supremacy.
    They assume a particular classical algorithm and current supercomputer hardware to do the speed comparison.

    The fact is, any claim of quantum supremacy that is later equalled or beaten by a classical algorithm was never legitimate quantum supremacy to begin with. The length of time required to beat the quantum computer is not even relevant, because it might just be that no one had tried to develop a faster algorithm.
    So unless one can somehow prove mathematically that a faster classical algorithm is simply impossible in a particular case, it is not legitimate to claim quantum supremacy.

    The whole quantum supremacy claim is just weird.

    When classical computers were being developed, people weren’t almost daily making the claim that they held supremacy over the abacus.

    What’s the purpose, other than to generate funding?

    When (if?) a quantum computer eventually comes along that is truly “supreme”, it will be obvious, because it will be able to do something truly advanced, like cracking cryptographic codes that are far beyond even the current standard.

  4. Andrew L says:

    One question I have is whether we can use this algorithm to analyze the raw results of the Google experiment. Regardless of whether it demonstrates quantum supremacy (as it appears from this that it does not) there’s still the question of whether it was actually computing correct results at all!

    The cross-entropy results only indirectly indicated that the answers were correct, but with this we should be able to determine whether the answers actually were correct.

  5. Matthew Cory says:

    It seems that a lot of claims were made about the abilities of quantum computers before the thermodynamic principles were ever fully understood. Is it not true that the speed of an actual reversible computer scales linearly with heat dissipation? In other words, the speed of a computational step scales linearly with applied force and energy dissipation? From the Heisenberg E-T bound and the bound on total entropy over a complete computation, a QC is bounded by (h*S^2)/(k*T), where S is the number of steps, h is Planck’s constant, k is Boltzmann’s constant and T is the ambient temperature. The runtime bounds on Grover’s and Shor’s algorithms don’t look too impressive under this basic analysis. MT and ML bounds dramatically overestimate the speed of quantum evolution.
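    For reference, the standard Margolus–Levitin bound on the time per orthogonal state transition, and the comment’s claimed total-runtime bound written in the same symbols (the latter is the commenter’s claim as stated, not an established result):

    ```latex
    % Margolus–Levitin bound: minimum time per transition to an
    % orthogonal state, with \langle E \rangle the mean energy
    % above the ground state
    t_\perp \;\ge\; \frac{h}{4\langle E \rangle}

    % The comment's claimed bound on the total runtime of S steps
    % (commenter's claim, as stated; not an established result)
    t_{\mathrm{total}} \;\ge\; \frac{h\, S^{2}}{k_{B}\, T}
    ```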
