Five years ago I wrote a post entitled Is Backgammon in P? It was based on conversations with Peter Bro Miltersen and Uri Zwick (shown together in the above picture) about the computational complexity of computing the values (and equilibrium points) of various stochastic games, and also on some things I learned from my game theory friends over the years about proving that values exist for some related games. A few weeks ago two former students of Peter, Rasmus Ibsen-Jensen and Kristoffer Arnsfelt Hansen visited Israel and I had a chance to chat with them and learn about some recent exciting advances.
In what way is Backgammon harder than chess?
Is there a polynomial time algorithm for chess? Well, if we consider the complexity of chess in terms of the board size, then it is fair to think that the answer is “no”. But if we wish to consider the complexity in terms of the number of all possible positions, then it is easy to go backward over all positions and determine the outcome of the game starting from each given position.
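To make the backward-over-all-positions idea concrete, here is a minimal sketch of backward induction (retrograde analysis) on an invented toy game, not chess: positions are the integers 0..n, a move decreases the position by 1 or 2, and the player to move at position 0 has lost.

```python
def solve(n):
    """Label positions 0..n by backward induction (retrograde analysis).

    win[p] is True iff the player to move at position p can force a win.
    A toy stand-in for chess: from p you may move to p-1 or p-2;
    the player to move at 0 has lost.
    """
    win = [False] * (n + 1)          # win[0] = False: the mover at 0 has lost
    for p in range(1, n + 1):        # one backward pass over all positions
        moves = [p - 1] + ([p - 2] if p >= 2 else [])
        # p is a win iff some move leads to a position that loses for the opponent
        win[p] = any(not win[q] for q in moves)
    return win
```

In this toy game the losing positions turn out to be exactly the multiples of 3; the point is only that a single pass from the end of the game backward labels every position in time polynomial in the number of positions.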
Now, what about backgammon? Like chess, backgammon is a game of complete information. The difference between backgammon and chess is the element of luck: at each position your possible moves are determined by a roll of two dice. This element of luck increases the computational difficulty of playing backgammon well compared to chess. It is easy to see that optimal strategies for the players in backgammon need not involve any randomness.
Problem 1: Is there a polynomial time algorithm to find the optimal strategy (and thus the value) of a stochastic zero sum game with perfect information? (Like backgammon)
This question (raised by Anne Condon in 1998) represents one of the most fundamental open problems in algorithmic game theory.
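To illustrate what is at stake, here is a hedged sketch of value iteration on a tiny invented instance of Condon's "simple stochastic game" model (MAX, MIN, and AVERAGE vertices, plus payoff sinks; the particular game below is made up). Iterating the one-step operator converges to the value vector, but no polynomial bound on the number of iterations is known — which is exactly the difficulty behind Problem 1.

```python
def value_iteration(kind, succ, payoff, iters=1000):
    """Value iteration for a simple stochastic game.

    kind[v] is 'max', 'min', 'avg', or 'sink'; succ[v] lists the
    successors of v; payoff maps sink vertices to their fixed payoffs.
    Returns the (approximate) value of the game from each vertex.
    """
    n = len(kind)
    val = [payoff.get(v, 0.0) for v in range(n)]
    for _ in range(iters):
        new = val[:]
        for v in range(n):
            if kind[v] == 'sink':
                continue                      # sink values never change
            ws = [val[u] for u in succ[v]]
            if kind[v] == 'max':
                new[v] = max(ws)              # the MAX player moves
            elif kind[v] == 'min':
                new[v] = min(ws)              # the MIN player moves
            else:                             # 'avg': a fair random move
                new[v] = sum(ws) / len(ws)
        val = new
    return val
```

In the invented 4-vertex game used below (an AVERAGE vertex between payoff-1 and payoff-0 sinks, plus a MAX vertex choosing between them), the values converge quickly; in general, whether the value is computable in polynomial time is open.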
In what way is heads-up poker harder than backgammon?
Heads-up poker is just a poker game with two players. To make it concrete you may think about heads-up Texas hold’em poker. This is not a game with complete information, but according to the minimax theorem it still has a value. The optimal strategies are mixed and involve randomness.
Problem 2: Is there a polynomial time algorithm to find the optimal strategy (and thus the value) of a stochastic zero-sum game with incomplete information? (like heads-up Texas hold’em poker).
It would be very nice to find even a subexponential algorithm for a stochastic zero-sum game with incomplete information like poker.
Ehud Friedgut reminded me of the game MEDIAN which I proposed many years ago.
There are three players and they play the game for eight rounds. In every round all players simultaneously say a number between 1 and 8. A player whose number is (strictly) between the other two gets a point. At the end of the game the winner is the player whose number of points is strictly between those of the others.
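A quick scoring helper for one round — the game itself is as described above; the code is just my own illustration:

```python
def round_points(a, b, c):
    """Points for one round of MEDIAN: the player whose number is strictly
    between the other two scores a point; otherwise nobody scores."""
    calls = [a, b, c]
    pts = [0, 0, 0]
    for i, x in enumerate(calls):
        others = calls[:i] + calls[i + 1:]
        if min(others) < x < max(others):   # strictly between the other two
            pts[i] = 1
    return pts
```

For example, if the three players call 3, 5, and 7, only the middle player scores; if two players call the same number, nobody does.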
I am very happy to announce that a Ph.D. program in mathematics for international students at the Hebrew University of Jerusalem is now open. Here is the link to the home page.
About the program
The Einstein Institute of Mathematics at The Hebrew University in Jerusalem is offering PhD candidate positions for excellent international students.
The institute was inaugurated in 1925 with a lecture by Edmund Landau, who later served as one of the first heads of the department. It has since developed into a defining and leading place for mathematics research, with world-renowned faculty working in diverse areas of current research.
Our graduate program
Our graduate program gives students the chance to develop into researchers that shape mathematics of the future. The department offers a uniquely attractive environment to learn and work, with weekly seminars, frequent special lecture series on current topics in mathematics and scientific exchange with visiting researchers from around the world.
This is enriched further by the Israel Institute for Advanced Studies, situated at the Hebrew University, which organizes thematic years on state-of-the-art advances in science, and by close collaboration with the renowned departments of physics and of computer science and engineering. You can venture even further and visit the nearby Tel Aviv University, the Technion, the Weizmann Institute, Bar-Ilan University, Ben-Gurion University, or the University of Haifa, which contribute to the active research environment and with which we enjoy frequent and close scientific exchange.
The workshop on the “polynomial method” will take place at the Hebrew University of Jerusalem on Monday Dec 26 and Tuesday Dec 27. The event is organized by Jordan Ellenberg and Gil Kalai.
Monday 10-11:45 (Combinatorics seminar) Adam Sheffer – Geometric Incidences and the Polynomial Method
Location: Rothberg (CS) B220
On Monday afternoon we will have four talks at the library of Belgium house by
13:15-14:00 Peter Pach, Progression-free sets. New: SLIDES
14:10-14:55 Shoham Letzter
15:15-16:00 Jordan Ellenberg
16:10-16:55 Fedya Petrov, Group rings vs. polynomials. New: SLIDES
and a problem session moderated by Jordan starting at 16:55. New: PROBLEMS.
On Tuesday we start at 9:30 and will have four talks at the library of Belgium house:
9:30-10:15 Noga Alon, Combinatorial Nullstellensatz and its algorithmic aspects. New: SLIDES
10:35-11:20 Olga Holtz, A potpourri on power ideals, hyperplane arrangements, graphs, and zonotopes. New: SLIDES
11:30-12:15 Aart Blokhuis, The polynomial method in finite geometry
Wednesday 9:30-10:15 Anurag Bishnoi, Zeros of polynomials over a finite grid. New: SLIDES
Thursday 11:00-12:00 (Room 209, Mathematics) Seva Lev, Avoiding 3AP with differences.
Further informal discussions and talks may continue on Wednesday/Thursday.
The Thursday 14:30 Colloquium by Jordan Ellenberg will be on The cap set problem.
I will update titles as they come along.
Happy Hanukkah, Merry X-mas and a Happy New Year!!
Blogging has been slow recently, and I have various half-written posts on all sorts of interesting things, and plenty of unfulfilled promises. I want to quickly share with you two and a half news items regarding combinatorial designs. As you may remember, Peter Keevash proved the century-old conjecture on the existence of designs. We talked about it in this post, and again in this post. I wrote a Bourbaki exposition about this development and related results and problems. I gained some (rather incomplete) understanding of the proof by listening to Peter’s lectures, looking at the papers, talking to people, preparing my lecture about it, writing the presentation, and incorporating remarks of people who read earlier versions and pointed out that I had missed some important ingredient.
The existence of designs – second proof!
A new proof of Keevash’s theorem on the existence of designs was discovered by Stefan Glock, Daniela Kühn, Allan Lo, and Deryk Osthus! The proof is given in the paper The existence of designs via iterative absorption, and the paper also contains some new applications of the method of proof. This is great news! A second proof of a major difficult theorem is always very important and exciting. Keevash’s theorem gave a vast generalization of the problem to decompositions of hypergraphs into complete subhypergraphs, and the new theorem is an even more general hypergraph decomposition theorem. Congratulations!
New q-analogs of designs
One of the important open problems about designs is the existence of q-analogs. The first example was given in 1987 by Simon Thomas. Michael Braun, Tuvi Etzion, Patric R. J. Östergård, Alexander Vardy, and Alfred Wassermann found remarkable new q-designs. See also this article: Researchers found mathematical structure that was thought not to exist. Congratulations! It is an interesting question whether the new existence methods apply to q-analogs (and perhaps in greater generality to all sorts of algebraic gadgets).
Some more things
As part of a project with Nati Linial and Yuval Peled, I was interested in finding a k-dimensional simplicial complex on k(k+1) vertices with a complete (k-1)-dimensional skeleton and vanishing rational homology, such that every (k-1)-face is included in the same number of k-faces. (This “same number” must be k.) Better still, I want all links of i-faces to be combinatorially the same. For k=2 the 6-vertex triangulation of the real projective plane is an example, but I did not have any other example. I asked about it on MathOverflow and GNiklasch identified a remarkable example for k=3. (And there are some hopes for k=4.) Actually, I should devote a post to MathOverflow experiences; I got answers there to several problems that had intrigued me for decades.
One more thing: Daniela Kühn and Deryk Osthus have been involved in recent years (sometimes with coauthors) in knocking out some very important problems in graph theory and extremal combinatorics. Their ICM 2014 survey describes some of their work related to Hamiltonian cycles, including their solution to the famous Kelly conjecture.
Being again near general elections is an opportunity to look at some topics we talked about over the years.
I am quite fond of (and a bit addicted to) Nate Silver’s site FiveThirtyEight. Silver’s models tell us the probability that Hillary Clinton will win the election. It is very difficult to understand how the model relates to reality. What does it even mean that Clinton’s chances of winning are 81.5%? One thing we can do with more certainty is to compare the predictions of one model to those of another model.
Some data from Nate Silver. Comparing the chance of winning with the chance of winning the popular vote accounts for “aggregation of information”; the probability of a recount accounts for noise sensitivity. The computation of the winning probabilities themselves is also similar in spirit to the study of noise sensitivity/stability.
This data is a month old. Today, Silver gives probability above 80% for Clinton’s victory.
Nate Silver and information aggregation
Given two candidates “zero” and “one” and a fixed p > 1/2, suppose that every voter votes for “one” with probability p and for “zero” with probability 1-p, and that these events are statistically independent. Asymptotically complete aggregation of information means that, with high probability (for large populations), “one” will win.
Aggregation of information for the majority rule was studied by Condorcet in what is known as “Condorcet’s Jury theorem”. The US electoral rule, which is a two-tier majority with some weights, also aggregates information, but in a somewhat weaker way.
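A minimal Monte Carlo sketch of the jury-theorem phenomenon (the parameters below are invented for illustration): under the majority rule, the probability that the better candidate wins tends to 1 as the population grows.

```python
import random

def majority_correct_prob(n, p, trials=2000, seed=0):
    """Estimate the probability that simple majority elects candidate "one"
    when each of n voters independently votes "one" with probability p."""
    rng = random.Random(seed)            # fixed seed for reproducibility
    wins = 0
    for _ in range(trials):
        ones = sum(rng.random() < p for _ in range(n))
        wins += ones > n / 2             # "one" carries the majority
    return wins / trials
```

With p = 0.6, say, a jury of 11 elects the better candidate only around three quarters of the time, while a jury of 1001 does so essentially always — this is Condorcet's aggregation of information.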
The data in Silver’s forecast allows one to estimate aggregation of information based on actual polls, which give different probabilities for voters in different states. This is reflected by the relation between the probability of winning and the probability of winning the popular vote. Silver’s data allows one to see whether the simplistic models behave in a similar way to the models based on actual polls.
We talked about Condorcet’s Jury theorem in this 2009 post on social choice.
Marie Jean Nicolas Caritat, marquis de Condorcet (1743-1794)
Nate Silver and noise stability
Suppose that the voters vote at random and each voter votes for each candidate with probability 1/2 (again, independently). One property we ask of a voting method is that the outcome of the election be robust to noise of the following kind: flip each ballot with probability t, for small t. “Noise stability” means that if t is small, then the probability that such random errors in counting the votes change the identity of the winner is small as well. The majority rule is noise stable, and so is the US election rule (though not as much).
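Here is a small simulation sketch of this notion of noise (illustrative parameters only): flip each of n uniformly random ballots independently with probability t and check whether the majority winner changes.

```python
import random

def flip_prob(n, t, trials=2000, seed=0):
    """Estimate the probability that flipping each of n uniform random ballots
    independently with probability t changes the majority winner (n odd)."""
    rng = random.Random(seed)
    changed = 0
    for _ in range(trials):
        votes = [rng.random() < 0.5 for _ in range(n)]       # uniform ballots
        noisy = [v ^ (rng.random() < t) for v in votes]      # flip w.p. t
        changed += (sum(votes) > n / 2) != (sum(noisy) > n / 2)
    return changed / trials
```

For majority with 101 voters, the chance that 1% counting noise changes the winner is small, while 30% noise changes it quite often — the quantitative signature of noise stability.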
How relevant is noise sensitivity/stability for actual elections? One way to address this question is to compare noise sensitivity in the simple model of voting and errors to noise sensitivity in the model based on actual polls. Most relevant is Silver’s probability of a “recount.”
Nate Silver computes the probability of victory for every candidate by running many “noisy” simulations based on the outcomes of the polls. (The way different polls are analyzed, and how all the polls are aggregated together to give a model for voters’ behavior, is a separate interesting story.)
We talked about noise stability and elections in this 2009 post (and other places as well).
Nate Silver and measures of power.
The Banzhaf power index is the probability that a voter is pivotal (namely, her vote can determine the outcome of the election) when each voter votes for each candidate with equal probability. The Shapley-Shubik power index is the probability that a voter is pivotal under a different a priori distribution for the individual voters (under which the votes are positively correlated). Nate Silver computes certain power indices based on the distribution of votes in each state as described by his model. Of course, voters in swing states have more power. It could be interesting to compare the properties of the abstract power indices and the more realistic ones from FiveThirtyEight. For example, the Banzhaf power indices sum up to roughly the square root of the size of the population, while the Shapley-Shubik power indices sum up to one. It would be interesting to check the sum of pivotality probabilities under Silver’s model. (I’d guess that Silver’s model is closer to the Shapley-Shubik behavior.)
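For the majority rule the Banzhaf index can be estimated by direct simulation; the sketch below (with invented parameters) shows a single voter's pivotality probability decaying like one over the square root of the population — so the n indices indeed sum to roughly the square root of n.

```python
import random

def banzhaf_majority(n, trials=20000, seed=0):
    """Monte Carlo estimate of the Banzhaf index of one voter under simple
    majority with n voters (n odd): the probability that the other n-1
    votes split exactly evenly, so that this voter's ballot decides."""
    rng = random.Random(seed)
    pivotal = 0
    for _ in range(trials):
        others = sum(rng.random() < 0.5 for _ in range(n - 1))
        pivotal += (others == (n - 1) // 2)   # exact even split
    return pivotal / trials
```

For n = 101 the exact value is about 0.079 (it behaves like sqrt(2/(pi*n))), and it shrinks as n grows, consistent with the square-root sum mentioned above.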
We talked about elections, coalition forming and power measures here, here and here.
Nate Silver and the Hex election rule
In an earlier post we considered (but did not recommend) the HEX election rule. FiveThirtyEight provides a tool to represent the states of the US on a HEX board where the sizes of states are proportional to their numbers of electoral votes.
According to the HEX rule, one candidate wins by forming a continuous right-left path of winning states, and the other wins by blocking every such path or, equivalently, by forming a north-south path of winning states. The Hex rule is not “neutral” (symmetric under permuting the candidates).
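Checking who wins under such a rule is just a graph-reachability question. Below is a hedged sketch on a made-up 4-cell board; `adj`, `won`, `left`, and `right` are hypothetical names of my own, not anything from FiveThirtyEight's tool.

```python
from collections import deque

def has_crossing(adj, won, left, right):
    """BFS check: do the cells in `won` contain a path from some cell in
    `left` to some cell in `right`, under the adjacency map `adj`?
    (On a Hex board, exactly one player has such a crossing.)"""
    frontier = deque(c for c in left if c in won)
    seen = set(frontier)
    while frontier:
        c = frontier.popleft()
        if c in right:
            return True                      # reached the far side
        for d in adj[c]:
            if d in won and d not in seen:
                seen.add(d)
                frontier.append(d)
    return False
```

On a real Hex board the same routine, run once for each player with the sides swapped, decides the election, since exactly one of the two crossings exists.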
If we ask for winning a north-south path for red and an east-west path for blue then red wins. For a right-left blue path much attention should be given to Arizona and Kansas.
If we ask for winning a north-south path for blue and an east-west path for red then blue wins and the Reds’ best shot would be to try to gain Oregon.
Now, with the recent rise of the Democratic Party in the polls, it seems possible that we will witness two disjoint blue north-south paths (with Georgia), as well as a blue east-west path. For a percolation-interested, democratically-inclined observer (like me), this would be beautiful.
The mathematics of information aggregation and noise stability, and the anomaly of majority
One way to view these two basic properties of the majority rule as forms of stability to errors is as follows:
a) (Information aggregation, reformulated) If all voters vote for the better candidate and each ballot is flipped with some probability t, then, with high probability as the number of voters grows, the better candidate still wins.
We can also consider a weak form of information aggregation where t is a fixed small real number. One way to think about this property is to consider an encoding of a bit by a string of n identical copies. Decoding using the majority rule has good error-correction capabilities.
b) (Noise stability) If all voters vote at random (independently, with probability 1/2 for each candidate) and each ballot is flipped with some small probability t, then with high probability (as t gets smaller) this will not change the winner.
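The weak aggregation property in a) is exactly the repetition code in coding-theory terms; a tiny sketch (my own illustration, not from the post):

```python
def encode(bit, n):
    """Repetition code: encode one bit as n identical copies."""
    return [bit] * n

def decode(word):
    """Majority decoding of a (possibly corrupted) repetition codeword."""
    return int(sum(word) > len(word) / 2)
```

As long as fewer than half the copies are flipped, majority decoding recovers the original bit — which is the error-correction capability referred to above.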
The “anomaly of majority” refers to these two properties of the majority rule which in terms of the Fourier expansion of Boolean functions are in tension with each other.
It turns out that, for a sequence of voting rules, information aggregation is equivalent to the property that the maximum Shapley-Shubik power of the players tends to zero. (This is a theorem I proved in 2002; the quantitative relations are weak and not optimal.) Noise stability implies a positive correlation with some weighted majority rule, and it is equivalent to having an approximate low-degree Fourier representation. (These are results of Benjamini, Schramm, and me from 1999.) Aggregation of information when there are two candidates implies a phenomenon called indeterminacy when there are many candidates.
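The "approximate low-degree Fourier representation" can be made concrete on the smallest example. The sketch below (standard material, not from the post) computes the Fourier-Walsh coefficients of the majority of three votes: all the weight sits on degrees 1 and 3, with weight 3/4 on degree 1.

```python
from itertools import product
from math import prod

def fourier_coeffs(f, n):
    """Fourier-Walsh coefficients of f: {-1,1}^n -> {-1,1}.
    The coefficient of chi_S(x) = prod_{i in S} x_i is E[f(x) chi_S(x)]
    under the uniform distribution; S is encoded as a 0/1 mask."""
    points = list(product([-1, 1], repeat=n))
    coeffs = {}
    for mask in product([0, 1], repeat=n):
        total = sum(f(x) * prod(xi for xi, m in zip(x, mask) if m)
                    for x in points)
        coeffs[mask] = total / len(points)
    return coeffs

def maj3(x):
    """Majority of three +/-1 votes."""
    return 1 if sum(x) > 0 else -1
```

One finds Maj3 = (x1 + x2 + x3 - x1*x2*x3)/2: each degree-1 coefficient is 1/2 and the degree-3 coefficient is -1/2, so (by Parseval) 3/4 of the Fourier weight is on degree 1 — the low-degree concentration that noise stability is equivalent to.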
The anomaly of majority is important for the understanding of why classical information and computation is possible in our noisy world.
Frank Wilczek, one of the greatest physicists of our time, wrote in 2015 a paper about future physics in which he (among many other interesting things) predicts that quantum computers will be built! While somewhat unimpressed by the factoring of large integers, Wilczek is fascinated by the possibility that
A quantum mind could experience a superposition of “mutually contradictory” states
Now, imagine quantum elections where the desired outcome of the election is a superposition of Hillary and Donald (or Hillary’s and Donald’s states of mind, if you wish). For example, |Hillary> PLUS |Donald>.
Can we have a quantum voting procedure that has both a weak form of information aggregation and noise stability? A weak form of information aggregation amounts to the ability to correct a small fraction of random errors. Noise stability amounts to a decoding procedure based on low-degree polynomials. Such procedures are unavailable, and proving that they do not exist (or disproving it) is on my “to do” list.
The fact that no such quantum mechanisms are available appears to be important for the understanding of why robust quantum information and quantum computation is not possible in our noisy world!
Quantum elections and a quantum Arrow’s theorem were considered in the post “Democrat plus Republican over the square-root of two” by Nicole Yunger Halpern on “Quantum Frontiers”. The absence of a quantum analog of the anomaly of majority is related to various discussions we had on quantum computing, most recently in this post.
Nate Silver’s 2008 fifty home runs
One last point. I learned about Nate Silver from my friend Greg Kuperberg, probably from his MathOverflow answer to a question about mathematics and social science. There, Greg wrote, referring to the 2008 elections: “The main person at this site, Nate Silver, has hit 50 home runs in the subject of American political polling.” Indeed, in the 2008 elections Silver correctly predicted who would win in each of the 50 states of the US. This is clearly impressive, but does it reflect Silver’s superior methodology? Or was he simply lucky? Or does it perhaps suggest some problem with the methodology? (Or some combination of all these?)
One piece of information that I don’t have is the probabilities Silver assigned to each state in 2008. Of course, these probabilities are not independent, but based on them we can estimate the expected number of mistakes. (In the 2016 election, the expected number of mistakes in state outcomes is today above five.) Also here, because of dependencies, the expected value accounts for some substantial small probability of many errors occurring simultaneously. Silver’s methodology allows one to estimate the actual distribution of “for how many states will the predicted winner lose?” (This estimate is not given on the site.)
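By linearity of expectation, the expected number of wrongly called states is just the sum of the per-state miss probabilities, dependencies notwithstanding. A one-line sketch (the probabilities below are invented for illustration, not Silver's):

```python
def expected_mistakes(win_probs):
    """Given the model's per-state probabilities that the predicted winner
    actually wins, return the expected number of wrongly called states:
    the sum of the miss probabilities (linearity of expectation holds
    even though the state outcomes are correlated)."""
    return sum(1 - p for p in win_probs)
```

For example, three hypothetical states called at 95%, 90%, and 70% contribute an expected 0.45 mistakes; the full distribution of the number of mistakes, by contrast, does depend on the correlations.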
Now, suppose that the number of such errors is systematically lower than the predicted number of errors. If this is not due to luck, it may suggest that the probabilities for individual states are tilted toward the middle. (This need not have any bearing on the presidential probabilities.)
A major addiction problem…
Would you decide by yourself the elections if you could?
One mental experiment I am fond of asking people (usually before elections) is this: Suppose that just a minute before the votes are counted you can change the outcome of the election (say, the identity of the winner, or even the entire distribution of ballots) according to your own preferences. Let’s assume that this act will be completely secret. Nobody else will ever know it. Will you do it?
In 2008 we ran a post with a poll about it.
We can run a new poll specific to the US 2016 election.
Reflections about elections
I really like days of elections and their special atmosphere in Israel where I try never to miss them, and also in the US (I often visit the US on Novembers). I also believe in democracy as a value and as a tool. Often, I don’t like the results but usually I can feel happy for those who do like the results. (And by definition, in some sense, most people do like the outcomes.)
Live streaming for Avifest is available here. The program is here. Following the first two lectures, I can attest that the technical quality of the broadcast is very good and the scientific quality of the lectures is superb. As this is posted, Dick Karp has started his lecture. Go for it!!
“Groups have always played a central role in the different branches of Algebra, yet their importance goes far beyond the limits of algebraic research. One of the most significant examples of this is the work of Alex Lubotzky. Over the last 35 years, Alex has developed and applied group theoretic methods to different areas of mathematics including combinatorics, computer science, geometry and number theory. His research led the way in the study of expander graphs, p-adic Lie groups, profinite groups, subgroup growth and many geometric counting problems. The 20th Midrasha, dedicated to Alex’s 60th birthday, will include lectures from leading mathematicians in these fields presenting their current work.”
My friendship with Alex goes back well over forty years; we shared exotic experiences on the Jordan River and the Amazon River, shared apartments at Yale, taught a course together five times, and more.
My young friend and colleague Karim Adiprasito told me that he has funding for postdocs (with or without teaching) and students (with or without teaching), both at the Hebrew University of Jerusalem (HUJI) and at MPI/University of Leipzig. Both places have great combinatorics groups, as well as highly active research groups in other areas, and beautiful surroundings. If you are interested in or around the type of things Karim is doing, please send him an email at firstname.lastname@example.org (with an appropriate subject line that reflects your intention to apply).
MPI = Max Planck Institute.
Further postdoctoral opportunities at HUJI.
The HUJI part of the announcement above can be generalized! Like every year, we have funding for several postdoctoral positions in and around combinatorics, for 1-3 years, here at the Hebrew University of Jerusalem. The starting time is flexible. If you do research in combinatorics or related areas, you may enjoy our special environment (and weather, and sights), our lovely group of combinatorialists in both the mathematics and computer science departments, and the other great research groups around.
An International Ph.D. Program in mathematics starting fall 2017 at HUJI.
We have already had a few Ph.D. students coming from other countries, but starting fall 2017 we will make a special effort to attract and accommodate foreign Ph.D. students.
A 2-3 days workshop about the polynomial method at HUJI around X-mas 2016.
Jordan Ellenberg and I are planning an informal workshop about the “polynomial method” around Christmas 2016 in Jerusalem.