Monthly Archives: December 2009

Randomness in Nature

Here is an excellent question asked by Liza on “Mathoverflow”.

What is the explanation of the apparent randomness of high-level phenomena in nature? For example the distribution of females vs. males in a population (I am referring to randomness in terms of the unpredictability and not in the sense of it necessarily having to be evenly distributed).

1. Is it accepted that these phenomena are not really random, meaning that given enough information one could predict it? If so isn’t that the case for all random phenomena?

2. If there is true randomness and the outcome cannot be predicted – what is the origin of that randomness? (is it a result of the randomness in the micro world – quantum phenomena etc…)

Where can I find resources about the subject?

Some answers and links can be found below the question in MO. (The question was closed after a few hours.) More answers and further discussion are welcome here.

Related posts: noise, Thomas Bayes and probability, four derandomization problems, some philosophy of science

And here is a related post by Peter Cameron addressing the question “what is probability?”.

Midrasha News

Our Midrasha is going very, very well. There are many great talks, mostly very clear and helpful, in various directions that interlace nicely. There are some moving new mathematical breakthroughs, a few of them fresh from the oven. Tomorrow is the last day.

Update: I will try to describe some highlights and links to the presentations, related papers, and pictures in a later post.

Joe Malkevitch: Why Planar Graphs are so Exceptional


Not only do interesting questions arise from considering the special class of planar graphs, but additional issues arise when one considers a specific plane drawing of a planar graph. This is because once a graph is drawn in the plane, it becomes possible to order the edges at a particular vertex in an organized, consistent way (say, in counterclockwise cyclic order).

One example of results which started from this fact about plane graphs is the paper of Grünbaum and Motzkin: B. Grünbaum and T. S. Motzkin, The number of hexagons and the simplicity of geodesics on certain polyhedra, Canad. J. Math. 15 (1963), pp. 744–751. In this paper the following is explored (along with results about what today are called fullerenes). Suppose one has a 3-valent plane graph. Pick any edge and move along it (in either direction); upon reaching a vertex one has the choice of turning left or right onto another edge. Suppose one turns left, then at the next vertex of the traversal turns right, and so on, alternating left and right turns. Grünbaum and Motzkin call such a walk a left-right path, and they prove that, in a plane 3-valent graph in which the number of sides of every face is a multiple of 3, these left-right paths (starting from any edge) always form simple circuits.
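To make the left-right rule concrete, here is a minimal Python sketch. It encodes a specific plane drawing as a rotation system (the counterclockwise order of neighbors at each vertex); arriving at a vertex, a right turn takes the next neighbor counterclockwise after the edge of arrival, and a left turn takes the previous one. The K4 example (a 3-valent plane graph all of whose faces are triangles, so the divisibility-by-3 condition holds) is an illustrative choice of mine, not taken from the Grünbaum-Motzkin paper.

```python
def left_right_path(rotation, start_edge, first_turn="left"):
    """Trace a left-right path in a plane graph given by a rotation system.

    rotation[v] lists the neighbors of v in counterclockwise order (this
    encodes a specific plane drawing).  start_edge is a directed edge (u, v);
    turns alternate left, right, left, right, ...  The walk stops when the
    starting directed edge is about to be traversed again with the starting turn.
    """
    u, v = start_edge
    turn = first_turn
    path = [u, v]
    while True:
        ring = rotation[v]
        i = ring.index(u)
        # Arriving at v along the edge from u: the next neighbor counter-
        # clockwise after u is a right turn, the previous one is a left turn.
        if turn == "right":
            w = ring[(i + 1) % len(ring)]
        else:
            w = ring[(i - 1) % len(ring)]
        turn = "left" if turn == "right" else "right"
        u, v = v, w
        if (u, v) == start_edge and turn == first_turn:
            return path
        path.append(v)


# K_4 drawn with vertex 0 inside the triangle 1,2,3: a 3-valent plane graph
# all of whose faces are triangles, so the divisibility-by-3 condition holds.
rotation = {0: [1, 2, 3], 1: [2, 0, 3], 2: [3, 0, 1], 3: [1, 0, 2]}
path = left_right_path(rotation, (1, 0))
print(path, "simple circuit:", len(set(path[:-1])) == len(path[:-1]))
# -> [1, 0, 3, 2, 1] simple circuit: True
```

In this example the left-right path closes up into the simple 4-circuit 1, 0, 3, 2, as the theorem predicts.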

I was able to extend this result in several directions. Continue reading

When Noise Accumulates

I wrote a short paper entitled “When Noise Accumulates” that contains the main conceptual points (described rather formally) of my work regarding noisy quantum computers. Here is the paper. (Update: Here is a new version, Dec 2010.) The new and exciting Innovations in Computer Science conference in Beijing seemed tailor-made for this kind of work, but the paper did not make it that far. Let me quote the first few paragraphs. As always, remarks are welcome!

From the introduction: Quantum computers were proposed by Feynman and others, and formally described by Deutsch, who also suggested that they can outperform classical computers. The idea was that since simulating quantum physics on digital computers requires an exponential number of steps, computers based on quantum physics may outperform classical computers. Spectacular support for this idea came with Shor’s theorem, which asserts that factoring is in BQP (the complexity class described by quantum computers).

The feasibility of computationally superior quantum computers is one of the most fascinating and clear-cut scientific problems of our time. The main concern regarding quantum-computer feasibility is that quantum systems are inherently noisy. (This concern was put forward in the mid-90s by Landauer, Unruh, and others.) 

The theory of quantum error correction and fault-tolerant quantum computation (FTQC), and in particular the threshold theorem, which asserts that under certain conditions FTQC is possible, provides strong support for the possibility of building quantum computers.

However, as far as we know, quantum error correction and quantum fault tolerance (and the highly entangled quantum states that enable them) are not witnessed in natural quantum processes. It is therefore not clear that computationally superior quantum computation is necessary to describe natural quantum processes.

We will try to address two closely related questions. The first: what are the properties of quantum processes that do not exhibit quantum fault tolerance, and how can such processes be modeled formally? The second: what kinds of noise models cause quantum error correction and FTQC to fail?

A main point we would like to make is that it is possible that there is a systematic relation between the noise and the intended state of a quantum computer. Such a systematic relation does not violate the linearity of quantum mechanics, and it is expected to occur in processes that do not exhibit fault tolerance.

Let me give an example: suppose that we want to simulate on a noisy quantum computer a certain bosonic state. The standard view of noisy quantum computers asserts that under certain conditions this can be done up to an error described in terms of the computational basis. In contrast, the type of noise we expect amounts to having a mixed state between the intended bosonic state and other bosonic states (which represent the noise).
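The following toy numpy sketch is only meant to illustrate the distinction between the two pictures of noise; the two-qubit states, the noise rate, and the specific mixtures are hypothetical choices for illustration, not taken from the paper.

```python
import numpy as np

def ket(*amps):
    v = np.array(amps, dtype=complex)
    return v / np.linalg.norm(v)

def proj(v):
    return np.outer(v, v.conj())

# Intended two-qubit state (a stand-in for the "intended bosonic state"):
# the symmetric superposition (|01> + |10>)/sqrt(2).
psi = ket(0, 1, 1, 0)
p = 0.1  # noise rate

# Standard picture: the error is described in the computational basis,
# here full dephasing of the state in that basis with probability p.
rho_standard = (1 - p) * proj(psi) + p * np.diag(np.diag(proj(psi)))

# Alternative picture discussed in the post: the noisy state is a mixture
# of the intended state with another state "of the same kind" (here another
# symmetric state), i.e. the noise is systematically related to the intended state.
phi = ket(1, 0, 0, 1)  # (|00> + |11>)/sqrt(2)
rho_related = (1 - p) * proj(psi) + p * proj(phi)

for name, rho in [("computational-basis noise", rho_standard),
                  ("state-dependent noise", rho_related)]:
    fidelity = np.real(psi.conj() @ rho @ psi)
    print(name, "| trace:", np.trace(rho).real, "| fidelity with psi:", round(fidelity, 3))
```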

Criticism: A criticism expressed by several readers of an early version of this paper is that no attempt is made to motivate the conjectures from a physical point of view and that the suggestions seem “unphysical.” What can justify the assumption that a given error lasts for a constant fraction of the entire length of the process? If a noisy quantum computer in a highly entangled state has correlated noise between faraway qubits, as we suggest, wouldn’t that allow signaling faster than the speed of light?

I had a somewhat mixed reaction to this criticism. On the one hand, I think it is important, and may be fruitful, to examine various models of noise while putting the physics aside. Nevertheless, I added a brief discussion of some physical aspects. Here it is:

Continue reading

Plans for polymath3

Polymath3 is planned to study the polynomial Hirsch conjecture. In order not to conflict with Tim Gowers’s next polymath project, which I suppose will start around January, I propose that we start Polymath3 in mid-April 2010. I plan to write a few posts on the problem until then. We had a long and interesting discussion regarding the Hirsch conjecture, followed by another interesting discussion. We can continue the discussion here.

One direction that I see as promising is to examine the known upper and lower bounds for the abstract problem. Here, again, is a link to the paper Diameter of Polyhedra: The Limits of Abstraction by Friedrich Eisenbrand, Nicolai Hahnle, Sasha Razborov, and Thomas Rothvoss.

I would also be happy to hear your thoughts about strongly polynomial algorithms for linear programming, either via randomized pivot rules for the simplex algorithm or by any other method. It occurred to me that I am not aware of any work trying to examine the possibility that there is no strongly polynomial algorithm for linear programming. Nor am I aware of any work which shows that a strongly polynomial algorithm for LP is a consequence of some (however unlikely) computational assumption. (Is it a consequence of P = NP? Of P = PSPACE?)


Four Derandomization Problems

Polymath4 is devoted to a question about derandomization: to find a deterministic polynomial-time algorithm for finding a k-digit prime. So I (belatedly) devote this post to derandomization and, in particular, to the following four problems.

1) Find a deterministic algorithm for primality

2) Find a combinatorial characterization (and a deterministic algorithm) for generically 3-rigid graphs

3) Find an explicit construction for Ramsey graphs

4) Find a deterministic method for polling

(Remark: I was very slow in writing this post, and I had to give up elaborating the details on some very nice topics in order to finish it.)
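As a point of comparison for the Polymath4 question, here is a small Python sketch of the easy randomized algorithm one would like to derandomize: pick random k-digit numbers and test them for primality; by the prime number theorem roughly one in 2.3k of them is prime, so the expected number of trials is polynomial in k. (The primality test used below is the randomized Miller-Rabin test; problem 1 above concerns deterministic primality testing. The code is an illustration of the randomized baseline, not of any proposed derandomization.)

```python
import random

def is_probable_prime(n, rounds=20):
    """Miller-Rabin primality test with random bases (a randomized test)."""
    if n < 2:
        return False
    for p in [2, 3, 5, 7, 11, 13, 17, 19, 23, 29, 31, 37]:
        if n % p == 0:
            return n == p
    d, s = n - 1, 0
    while d % 2 == 0:
        d //= 2
        s += 1
    for _ in range(rounds):
        a = random.randrange(2, n - 1)
        x = pow(a, d, n)
        if x in (1, n - 1):
            continue
        for _ in range(s - 1):
            x = pow(x, 2, n)
            if x == n - 1:
                break
        else:
            return False
    return True

def random_k_digit_prime(k):
    """Randomized k-digit prime finder: sample k-digit numbers until one is
    (probably) prime.  Polymath4 asks for a deterministic poly(k) algorithm."""
    lo, hi = 10 ** (k - 1), 10 ** k
    while True:
        n = random.randrange(lo, hi)
        if is_probable_prime(n):
            return n

print(random_k_digit_prime(50))
```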


Continue reading

Why are Planar Graphs so Exceptional

Harrison Brown asked the question “Why are planar graphs so exceptional?” on Mathoverflow, and I was happy to read it since it is a question I have often thought about over the years, as I am sure have many combinatorialists and graph theorists. Harrison was interested in comparing planar graphs to graphs embedded in other surfaces.

It will be nice to discuss this question further here. (Mathoverflow is not an ideal platform for discussions, but some interesting ones have emerged there.) So let me offer my answer. Another interesting answer was offered by Joseph Malkevitch.

Three exceptional characteristics of planar graphs

Duality: Perhaps duality is the crucial property of planar graphs. There is a theorem asserting that the dual of a graphic matroid M is again graphic if and only if M is the matroid of a planar graph G; in this case, the dual of M is the matroid of the dual graph of G. (See this Wikipedia article.) This means that the circuits of a planar graph are in one-to-one correspondence with the minimal cuts of the dual graph.

One important manifestation of the exceptional nature of planar graphs (which I believe is related to duality) is Kasteleyn’s formula for the number of perfect matchings, and the connection with counting spanning trees.
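To illustrate how Kasteleyn's method works in the simplest setting, here is a small numpy sketch for the m-by-n grid graph. It uses the standard Kasteleyn (Pfaffian) orientation of the grid, in which all horizontal edges point to the right and the vertical edges alternate direction from column to column; the number of perfect matchings is then the Pfaffian of the signed adjacency matrix, i.e. the square root of its determinant. (The grid example and the code are merely illustrative; Kasteleyn's theorem applies to all planar graphs.)

```python
import numpy as np

def grid_perfect_matchings(m, n):
    """Count perfect matchings of the m-by-n grid graph via Kasteleyn's method:
    build the skew-symmetric signed adjacency matrix K of a Pfaffian orientation
    and return |Pf K| = sqrt(|det K|)."""
    def idx(r, c):
        return r * n + c

    K = np.zeros((m * n, m * n))
    for r in range(m):
        for c in range(n):
            if c + 1 < n:                   # horizontal edges all point right
                K[idx(r, c), idx(r, c + 1)] = 1
                K[idx(r, c + 1), idx(r, c)] = -1
            if r + 1 < m:                   # vertical edges alternate by column
                s = (-1) ** c
                K[idx(r, c), idx(r + 1, c)] = s
                K[idx(r + 1, c), idx(r, c)] = -s
    # Floating-point determinant; exact after rounding for small grids.
    return round(abs(np.linalg.det(K)) ** 0.5)

print(grid_perfect_matchings(2, 2))  # 2
print(grid_perfect_matchings(4, 4))  # 36
print(grid_perfect_matchings(8, 8))  # 12988816 domino tilings of the chessboard
```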

Robust geometric descriptions: Another conceptual difference is that (3-connected or maximal) planar graphs are the graphs of convex 3-dimensional polytopes, and thus have extra geometric properties that graphs embedded in other surfaces do not share. (This is Steinitz’s theorem.)

The geometric definition of planar graphs (unlike various generalizations) is very robust. A graph is planar if it can be drawn in the plane so that the edges are represented by Jordan curves that do not intersect in their interiors; we get the same class of planar graphs if we replace “Jordan curves” by “straight line segments,” or if we replace “no intersection” by “an even number of crossings” for every pair of non-adjacent edges. The Koebe-Andreev-Thurston theorem allows us to represent every planar graph as the “touching graph” of nonoverlapping circles. Both (related) representations, via convex polytopes and via circle packings, can be chosen to respect the group of automorphisms of the graph and its dual.

Simple inductive constructions: Another exceptional property of the class of planar graphs is that they can be built by simple inductive constructions. (In this respect they are similar to the class of trees, although the inductive constructions are not as simple as for trees.) This fails for most generalizations of planar graphs.

A related important property of planar graphs, maps, and triangulations (with labeled vertices) is that they can be enumerated very nicely. This is Tutte theory. (It has deep extensions to surfaces.)
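One concrete instance of such nice enumeration (chosen here just as an illustration) is Tutte's closed formula for the number of rooted planar maps with n edges, 2·3^n·C(2n,n)/((n+1)(n+2)):

```python
from math import comb

def rooted_planar_maps(n):
    """Tutte's formula: the number of rooted planar maps with n edges."""
    return 2 * 3 ** n * comb(2 * n, n) // ((n + 1) * (n + 2))

print([rooted_planar_maps(n) for n in range(6)])  # [1, 2, 9, 54, 378, 2916]
```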

It is often the case that results about planar graphs extend to other classes. As I mentioned, Tutte theory extends to triangulations of other surfaces. Another example is the fundamental Lipton-Tarjan separator theorem, which extends to all graphs with a forbidden minor.
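For readers who have not met the separator theorem, the following toy sketch shows what it asserts in the easiest case, the k-by-k grid: the middle column, of size sqrt(n), separates the n = k^2 vertices into two parts, each of size at most 2n/3. The Lipton-Tarjan theorem guarantees a separator of size O(sqrt(n)) for every planar graph; the grid example is merely illustrative.

```python
import math

def grid_separator(k):
    """Toy illustration of the planar separator theorem on the k-by-k grid:
    removing the middle column (sqrt(n) vertices) leaves two pieces, each
    containing at most 2n/3 of the n = k*k vertices."""
    n = k * k
    mid = k // 2
    separator = [(r, mid) for r in range(k)]
    left = [(r, c) for r in range(k) for c in range(mid)]
    right = [(r, c) for r in range(k) for c in range(mid + 1, k)]
    assert len(separator) == math.isqrt(n)
    assert max(len(left), len(right)) <= 2 * n / 3
    return len(separator), len(left), len(right)

print(grid_separator(10))  # (10, 50, 40): a 10-vertex separator for 100 vertices
```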

Two interesting generalizations

Continue reading