When do we say that one event causes another? Causality is a topic of great interest in statistics, physics, philosophy, law, economics, and many other places. Now, if **causality** is not complicated enough, we can ask what is the **influence** one event has on another one. Michael Ben-Or and Nati Linial wrote a paper in 1985 where they studied the notion of influence in the context of **collective coin flipping**. The title of the post refers also to Nati’s influence on my work since he got me and Jeff Kahn interested in a conjecture from this paper.

## Influence

The word “influence” (dating back, according to the Merriam-Webster dictionary, to the 14th century) is close to the word “fluid”. The original definition of influence is: “an ethereal fluid held to flow from the stars and to affect the actions of humans.” The modern meaning (according to Wiktionary) is: “The power to affect, control or manipulate something or someone.”

## Ben-Or and Linial’s definition of influence

Collective coin flipping refers to a situation where n processors or agents wish to agree on a common random bit. Ben-Or and Linial considered very general protocols for reaching a single random bit, and also studied the simple case where the collective random bit is given by a Boolean function of n bits, one contributed by every agent. If all agents act appropriately, the collective bit will be ’1′ with probability 1/2. The purpose of collective coin flipping is to create a random bit R which is as immune as possible against attempts by one or more agents to bias it towards ’1′ or ’0′.

Given such a protocol, the influence of a set S of agents towards ’0′ is the probability that R=0 when the agents in S try to tilt the outcome of the coin flipping towards ’0′ as much as possible. The influence towards ’1′ is defined in the same way, and the influence of S is the sum of these two quantities. To make the definition clearer we should explain what the agents in S can do. Here we assume that, in the case of simultaneous action by all agents, the “bad guys” can wait for the contributions of all other agents before making their move. The bad guys can only change their own inputs to the procedure.

**Notations**: When the protocol is denoted by P, then for a set S of processors we denote by I_S^1(P) their influence toward ’1′, by I_S^0(P) their influence toward ’0′, and let I_S(P) = I_S^0(P) + I_S^1(P). The influence of a single processor k is denoted by I_k(P), and the sum I(P) = I_1(P) + I_2(P) + … + I_n(P) is called the **total influence** of P. (In the Boolean case we refer to a “processor” or “agent” simply as a variable, and talk about the influence of a variable and the influence of a set of variables.)
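
For concreteness, here is a small brute-force sketch of these quantities for the Boolean-function case (the Python function `influence` and its conventions are mine, written for illustration): influence toward ’1′ is taken as the amount by which the coalition S can raise the probability that f=1 above the honest baseline, and likewise toward ’0′. With this convention, the influence of a single variable is exactly the probability that the variable is pivotal.

```python
from itertools import product

def influence(f, n, S):
    """Ben-Or--Linial influence of the coalition S on the Boolean function f,
    using the convention that influence toward '1' is the gain over the
    honest baseline probability (and likewise toward '0')."""
    others = [i for i in range(n) if i not in S]
    base1 = sum(f(x) for x in product((0, 1), repeat=n)) / 2 ** n
    can1 = can0 = 0
    for y in product((0, 1), repeat=len(others)):      # honest random bits
        outcomes = set()
        for z in product((0, 1), repeat=len(S)):       # coalition's choices
            x = [0] * n
            for i, b in zip(others, y):
                x[i] = b
            for i, b in zip(S, z):
                x[i] = b
            outcomes.add(f(tuple(x)))
        can1 += 1 in outcomes
        can0 += 0 in outcomes
    m = 2 ** len(others)
    return (can1 / m - base1) + (can0 / m - (1 - base1))

maj3 = lambda x: int(sum(x) >= 2)
print(influence(maj3, 3, [0]))              # 0.5: the pivotal probability
print(influence(lambda x: x[0], 3, [0]))    # 1.0: a dictator has full influence
```

For majority of three, each voter is pivotal exactly when the other two split, which happens with probability 1/2, and the computation recovers this.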

## Notions of powers

Boolean functions can be regarded as “voting rules”. A Boolean function describes a way to move from the votes of n voters between two candidates to the collective decision of the society. Thinking of Boolean functions as voting rules provides nice names for special kinds of Boolean functions. “Dictatorship” refers to functions of the form f(x_1, x_2, …, x_n) = x_k. For the “majority function” (when n is odd) the value of f is ’1′ if and only if x_k = 1 for more than half of the variables x_k. For monotone Boolean functions, the influence of the kth variable coincides with the “Banzhaf power index” defined in game theory. Another related important notion of power is the Shapley-Shubik power index.
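
A variable’s influence here is the probability that flipping its bit changes the outcome, which for a monotone function is precisely its Banzhaf power index. A toy sketch (the Python helper names are mine):

```python
from itertools import product

def pivotal_influence(f, n, k):
    """Influence of variable k: the probability, over uniform random bits,
    that flipping bit k changes the value of f.  For a monotone f this
    equals the Banzhaf power index of voter k."""
    count = 0
    for x in product((0, 1), repeat=n):
        y = x[:k] + (1 - x[k],) + x[k + 1:]
        count += f(x) != f(y)
    return count / 2 ** n

maj5 = lambda x: int(sum(x) > 2)      # majority of five voters
dictator = lambda x: x[0]             # voter 0 decides alone

print(pivotal_influence(maj5, 5, 0))      # C(4,2)/2**4 = 0.375
print(pivotal_influence(dictator, 5, 0))  # 1.0
print(pivotal_influence(dictator, 5, 1))  # 0.0: the other voters are powerless
```
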

## The KKL Theorem

Picture: Muli Safra

After many months of working on the conjecture, Jeff, Nati, and I managed to prove it. Let f be a Boolean function of n variables.

**Theorem 1 (KKL):** If Pr(f=1) = t, then there exists a variable k so that I_k(f) ≥ c·t·log n/n, where c is an absolute constant.

A repeated application of this theorem shows that:

**Theorem 2 (KKL):** If Pr(f=1) = t then, for every ε > 0, there exists a set S of at most c(t,ε)·n/log n variables so that I_S(f) ≥ 1 − ε.

## Examples, examples

**Majority**

For the majority function with n variables, the influence of every variable is proportional to 1/√n.

**The tribes example**

This is the basic example of Ben-Or and Linial for Boolean functions with low influence. The society is divided into a large number of tribes, each having roughly log_2 n − log_2 log_2 n members. The value of f is one if and only if there exists a tribe whose members all vote ’1′. For this example the influence of every variable is Θ(log n/n).
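
As a sanity check, one can compare brute-force influence in a tiny tribes function against the closed-form expression 2^{−(s−1)}·(1−2^{−s})^{t−1} for a variable in one of t tribes of size s: a variable is pivotal exactly when the rest of its tribe votes ’1′ and no other tribe is unanimous. (A toy computation; the function names are mine.)

```python
from itertools import product

def tribes(x, s):
    """1 iff some consecutive block ('tribe') of s bits is all ones."""
    return int(any(all(x[i:i + s]) for i in range(0, len(x), s)))

def influence_bf(f, n, k):
    """Brute-force pivotal probability of variable k."""
    return sum(f(x) != f(x[:k] + (1 - x[k],) + x[k + 1:])
               for x in product((0, 1), repeat=n)) / 2 ** n

s, t = 2, 3                  # 3 tribes of size 2, so n = 6
n = s * t
# pivotal iff the rest of the tribe is all ones and no other tribe is all ones:
closed_form = 2 ** -(s - 1) * (1 - 2 ** -s) ** (t - 1)
print(influence_bf(lambda x: tribes(x, s), n, 0), closed_form)  # both 0.28125
```

With the tribe size s chosen near log_2 n, the closed form is Θ(log n/n), matching the claim above.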

The next two examples represent more complicated protocols for collective coin flipping. (Not just Boolean functions.)

**Mike Saks’ “passing the baton” example**

We start with some voter who holds the baton. This voter passes the baton to another random voter. Every voter who gets the baton passes it to a random voter who did not yet hold it. The last voter to hold the baton chooses the random collective bit. In this example, bad agents cannot tilt the outcome significantly. (The best strategy for the “bad guys” is to pass the baton to a “good guy”.)
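
A rough Monte Carlo sketch of this protocol (illustrative Python, with the cheaters’ strategy of always handing the baton to an honest player hard-coded): with a single bad agent among twenty, the estimated probability that the collective bit is ’1′ stays close to 1/2.

```python
import random

def baton(n, bad, rng):
    """One run of the baton-passing protocol.  `bad` is a set of cheaters
    who want the collective bit to be 1; their best play is simply to hand
    the baton to an honest player whenever one remains."""
    remaining = set(range(n))
    holder = rng.choice(sorted(remaining))
    while len(remaining) > 1:
        remaining.discard(holder)
        honest_left = sorted(remaining - bad)
        if holder in bad and honest_left:
            holder = rng.choice(honest_left)        # cheater dumps the baton
        else:
            holder = rng.choice(sorted(remaining))  # honest: uniform pass
    return 1 if holder in bad else rng.randint(0, 1)

rng = random.Random(0)
trials = 20000
p1 = sum(baton(20, {0}, rng) for _ in range(trials)) / trials
print(p1)   # close to 1/2: one cheater among twenty barely tilts the coin
```

Since honest players pass uniformly at random, the cheater ends up holding the baton last only with probability about 1/n, so the bias is only of order 1/n.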

**Uri Feige’s “two rooms” example**

Every agent enters one of two rooms at random. The room with fewer agents is selected, and every agent in this room again enters one of two rooms at random. This process continues (more or less) and at the end, as before, the last remaining agent contributes the collective random bit.

This process is immune against a constant number of “bad guys”. (The first such example was found by Noga Alon and Moni Naor.) The number of rounds in this protocol (appropriately optimized) goes to infinity extremely slowly. It is not known whether there is a protocol with similar properties with a bounded number of rounds.

## Influence and threshold behavior

Consider a monotone Boolean function f of n variables. Let μ_p be the product probability measure where every bit equals ’1′ with probability p, independently. The definition of influence extends without change to the setting of biased product distributions. The influence of the kth variable on the Boolean function f with respect to μ_p is denoted by I_k^p(f). The probability Pr_p(f=1) is a monotone function of p, and **Russo’s lemma** asserts that the derivative of Pr_p(f=1) with respect to p is precisely the total influence I^p(f) = I_1^p(f) + … + I_n^p(f). Therefore, large influence is related to “sharp threshold behavior”: namely, to a very short interval between the value of p where Pr_p(f=1) is very close to 0 and the value of p where Pr_p(f=1) is very close to 1. Simple consequences of the KKL theorem for the study of threshold behavior were noted by Ehud Friedgut and me, and Friedgut found an important theorem giving conditions for sharp threshold behavior when p itself is a function of n. Muli Safra and I wrote a survey paper on influences and threshold behavior.
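
Russo’s lemma can be checked numerically on a tiny example (a toy computation, not the proof): for the majority function on three bits, the derivative of Pr_p(f=1) matches the total influence, which in this case equals 6p(1−p).

```python
from itertools import product

def pr_p(f, n, p):
    """Pr[f = 1] under the product measure where each bit is 1 w.p. p."""
    total = 0.0
    for x in product((0, 1), repeat=n):
        w = 1.0
        for b in x:
            w *= p if b else 1 - p
        total += w * f(x)
    return total

def total_influence_p(f, n, p):
    """Sum over k of the probability (under mu_p) that variable k is pivotal."""
    s = 0.0
    for k in range(n):
        pivotal = lambda x: int(f(x[:k] + (1,) + x[k + 1:]) !=
                                f(x[:k] + (0,) + x[k + 1:]))
        s += pr_p(pivotal, n, p)
    return s

maj3 = lambda x: int(sum(x) >= 2)
p, h = 0.3, 1e-6
deriv = (pr_p(maj3, 3, p + h) - pr_p(maj3, 3, p - h)) / (2 * h)
print(deriv, total_influence_p(maj3, 3, p))   # both 6*p*(1-p) = 1.26
```
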

## Aggregation of information and Condorcet’s Jury theorem

Four sentences about the connection with game theory: The sharp threshold phenomenon is called in economics “asymptotically complete aggregation of information”. This property goes back to an old theorem from the theory of voting called “Condorcet’s Jury Theorem”. The Shapley-Shubik power index of the kth variable, mentioned above, can be defined as the integral ∫_0^1 I_k^p(f) dp. (This is not the original axiomatic definition but a later theorem of Owen.) It turns out that for a sequence of monotone Boolean functions, the “sharp threshold phenomenon” is equivalent to “diminishing individual Shapley-Shubik power indices”. (But the quantitative aspects of this result are not satisfactory.)
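
Owen’s formula can also be verified numerically on a small example (toy Python; the permutation-based definition of the Shapley-Shubik index used below is the standard one, and the helper names are mine):

```python
from itertools import permutations, product
from math import factorial

def shapley_shubik(f, n, k):
    """Fraction of voter orderings in which voter k is the swing voter
    as voters switch from 0 to 1 one at a time."""
    piv = 0
    for order in permutations(range(n)):
        x = [0] * n
        for i in order:
            x[i] = 1
            if f(tuple(x)) == 1:     # the swing voter for this ordering
                piv += (i == k)
                break
    return piv / factorial(n)

def influence_p(f, n, k, p):
    """Probability under mu_p that variable k is pivotal for f."""
    s = 0.0
    for x in product((0, 1), repeat=n):
        if f(x[:k] + (1,) + x[k + 1:]) != f(x[:k] + (0,) + x[k + 1:]):
            w = 1.0
            for j, b in enumerate(x):
                if j != k:
                    w *= p if b else 1 - p
            s += w
    return s / 2   # each pivotal configuration is counted for x_k = 0 and 1

maj3 = lambda x: int(sum(x) >= 2)
# midpoint-rule approximation of the integral of I_k^p over p in [0, 1]:
integral = sum(influence_p(maj3, 3, 0, (i + 0.5) / 1000) for i in range(1000)) / 1000
print(shapley_shubik(maj3, 3, 0), integral)   # both close to 1/3
```

For majority of three, symmetry forces each voter’s Shapley-Shubik index to be 1/3, and indeed ∫_0^1 2p(1−p) dp = 1/3.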

## Influence without independence

A major conceptual challenge is to understand the concept of influence (and related notions of “sharp threshold phenomenon”) for distributions which are not product distributions, namely when the probabilities for individual bits to be ’1′ are **not** statistically independent. Does an observer in a committee meeting have an influence? Can there be a negative influence? Moving away from statistical independence is often very difficult and yet very important for most applications. This issue is addressed in the paper of Graham and Grimmett, and that of Haggstrom, Mossel, and myself.

## Two old conjectures about influence

There are quite a few problems regarding influences which remain unsolved. I will mention only two related conjectures, both dealing with the Boolean case.

**Conjecture 1 (Benny Chor):** Suppose that Pr(f=1)=1/2. Then there is a set S of variables, much smaller than the set of n/log n variables supplied by the KKL theorem, whose influence I_S(f) is bounded away from 0.

**Conjecture 2:** Suppose that Pr(f=1)=1/2. Then there is a similarly small set S of variables with I_S(f) close to 1.

The KKL theorem gives weak forms of both conjectures, with the conjectured size of S replaced by n/log n. The main difficulty here (and in various other problems in extremal combinatorics) is that arguing about influences of single variables is the only known route to influences of large sets. Both conjectures (like the KKL theorem itself) have natural formulations in terms of traces of families of sets, related to the Sauer-Shelah theorem.

**Update** (February 2013): Both these conjectures have now been disproved in a paper with Jeff Kahn: Functions without influential coalitions.

## More on Influence

The theory of poetic influence was developed by Yale’s literary critic Harold Bloom. His book “The Anxiety of Influence” deals with the process of influence as well as with the psychology of influence in literature. Philosopher Avishai Margalit studied influence in Philosophy in his paper on Wittgenstein. The paper will appear in a Festschrift for Peter Hacker by Blackwell Oxford, 2009. Section 3 opens with a distinction between influence and power: “Naked power is for anyone to see. Influence is not. It works its wonders in ways not readily observable. Influence is inferred from its effects. This is a major reason for the elusive nature of influence.”

Michael Ben-Or

## Trivia question:

What do Michael Ben-Or, Uzi Segal (whose calibration theorem was mentioned in the post on the controversy around expected utility theory), Mike Werman, Ehud Lehrer, Yehuda Agnon, myself, and quite a few others, all have in common?

Hint:

### Influence and coin flipping little updates (June 8)

Mike Saks made valuable comments regarding this post. The first protocol for collective coin flipping that is immune against a positive proportion of bad processors and uses only log* n + O(1) rounds was given by Alexander Russell and David Zuckerman. If each processor contributes only one bit per round, there is a matching lower bound by Russell, Saks, and Zuckerman. The open problems section of the RSZ paper is very interesting and, as far as we know, no further progress on the problems presented there has been made. The baton-passing method was also suggested by Shafi Goldwasser.

If we allow every processor to contribute many bits in every round then the situation is not clear even for one round. This is related to conjectures discussed by Ehud Friedgut.

Related problems are of great interest for quantum computation. Carlos Mochon has recently solved one of the most important longstanding open problems in quantum information theory, and found a protocol for weak coin flipping with arbitrarily small bias, using quantum bits. His work strengthens earlier work by Kitaev.

Related post: The Entropy Influence Conjecture. (Terry Tao’s blog)

Can you please give the exact reference for Avishai Margalit’s quote?

Dear Alef, I will add a link to the paper.


Gil,

I have been curious for a long time now how you (three) came up with the idea of using the Bonami-Beckner inequality in the proof of the KKL theorem. Certainly pre-KKL, this wasn’t a very well-known inequality. It is one of those great ideas that I have always marveled at, and (discovering your blog) can’t resist asking about!

Thanks a lot. I plan to have a post about the proof of KKL’s theorem and I hope others will also like it.


The two conjectures on influence of large coalitions mentioned in the post have now been disproved by Jeff Kahn and me. (See the update in the post for a link to the paper.)
