## Kelman, Kindler, Lifshitz, Minzer, and Safra: Towards the Entropy-Influence Conjecture

Let me briefly report on a remarkable new paper by Esty Kelman, Guy Kindler, Noam Lifshitz, Dor Minzer, and Muli Safra, Revisiting Bourgain-Kalai and Fourier Entropies. The paper describes substantial progress towards the Entropy-Influence conjecture, posed by Ehud Friedgut and me in 1996. (See this blog post from 2007.)

Abstract: The total influence of a function is a central notion in the analysis of Boolean functions, and characterizing functions that have small total influence is one of the most fundamental questions associated with it. The KKL theorem and the Friedgut junta theorem give a strong characterization of such functions whenever the bound on the total influence is $o(\log n)$; however, both results become useless when the total influence of the function is $\omega(\log n)$. The only case in which this logarithmic barrier has been broken for an interesting class of functions was proved by Bourgain and Kalai, who focused on functions that are symmetric under large enough subgroups of $S_n$.

In this paper, we revisit the key ideas of the Bourgain-Kalai paper. We prove a new general inequality that upper bounds the correlation between a Boolean function f and a real-valued, low degree function g in terms of their norms, Fourier coefficients, and total influences.

Some corollaries of this inequality are:

1. The Bourgain-Kalai result.
2. A slightly weaker version of the Fourier-Entropy Conjecture. More precisely, we prove that the Fourier entropy of the low-degree part of f is at most $O(I[f]\log^2 I[f])$, where $I[f]$ is the total influence of f. In particular, we conclude that the Fourier spectrum of a Boolean function is concentrated on at most $2^{O(I[f]\log^2 I[f])}$ Fourier coefficients.
3. Using well-known learning algorithms for sparse functions, the previous point implies that the class of functions with sub-logarithmic total influence (i.e., at most $O(\log n/(\log \log n)^2)$) is learnable in polynomial time, using membership queries.

Our theorem on influence under symmetry. From my videotaped lecture on Jean Bourgain. See this post about Jean Bourgain.

## A few remarks:

A) Given a Boolean function $f:\{-1,1\}^n\to \{-1,1\}$ let $f=\sum_{S \subset \{1,2,\dots ,n\}}\hat f(S)W_S$ be its Fourier-Walsh expansion. (Here $W_S(x_1,x_2,\dots ,x_n)=\prod_{i\in S}x_i$.) The total influence $I(f)$ of $f$ is described by

$I(f)=\sum_{S \subset \{1,2,\dots ,n\}}\hat f^2(S)|S|$.

The spectral entropy $E(f)$ of $f$ is defined by

$E(f)=-\sum_{S \subset \{1,2,\dots ,n\}}\hat f^2(S) \log (\hat f^2(S))$.

The entropy-influence conjecture (here called the Fourier-entropy conjecture) asserts that for some universal constant C

$I(f) \ge C E(f)$.
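As a quick numerical sanity check of these definitions (a brute-force sketch over all $2^n$ points, feasible only for small $n$, and not from the paper), one can compute the Fourier-Walsh coefficients of the 3-bit majority function, whose expansion is $\frac{1}{2}(x_1+x_2+x_3-x_1x_2x_3)$, giving $I(f)=3/2$ and $E(f)=2$ (with logarithms taken base 2):

```python
import itertools
import math

def fourier_coeffs(f, n):
    # hat f(S) = average over x in {-1,1}^n of f(x) * W_S(x),
    # where W_S(x) = prod_{i in S} x_i.
    points = list(itertools.product([-1, 1], repeat=n))
    subsets = itertools.chain.from_iterable(
        itertools.combinations(range(n), k) for k in range(n + 1))
    return {S: sum(f(x) * math.prod(x[i] for i in S) for x in points) / 2**n
            for S in subsets}

def total_influence(coeffs):
    # I(f) = sum_S hat f(S)^2 * |S|
    return sum(c * c * len(S) for S, c in coeffs.items())

def spectral_entropy(coeffs):
    # E(f) = sum_S hat f(S)^2 * log2(1 / hat f(S)^2); zero coefficients contribute 0
    return sum(c * c * math.log2(1 / (c * c)) for c in coeffs.values() if c)

maj3 = lambda x: 1 if sum(x) > 0 else -1   # majority of 3 bits
coeffs = fourier_coeffs(maj3, 3)
print(total_influence(coeffs), spectral_entropy(coeffs))  # prints: 1.5 2.0
```

Here the four nonzero squared coefficients are each $1/4$, so the entropy is exactly $2$ while the total influence is $3/2$, consistent with the conjectured inequality for any $C \le 3/4$.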

B) One interesting feature of the conjecture is that the RHS is invariant under arbitrary invertible linear transformations of $\{-1,1\}^n$ (regarded as an $n$-dimensional vector space) while the LHS is invariant only under permutations of the variables.
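To see this asymmetry concretely (again a brute-force illustration of my own, not from the paper): composing $f$ with an invertible linear map over $GF(2)$ merely permutes the Walsh functions $W_S$, so the spectral entropy is unchanged, while the degrees $|S|$, and hence the total influence, can change. The simplest instance is the dictator $f(x)=x_1$ versus the parity obtained by the substitution $x_1 \mapsto x_1x_2x_3$:

```python
import itertools
import math

def walsh_coeffs(f, n):
    # Brute-force Fourier-Walsh coefficients of f on {-1,1}^n.
    pts = list(itertools.product([-1, 1], repeat=n))
    subsets = itertools.chain.from_iterable(
        itertools.combinations(range(n), k) for k in range(n + 1))
    return {S: sum(f(x) * math.prod(x[i] for i in S) for x in pts) / 2**n
            for S in subsets}

def influence(c):
    return sum(v * v * len(S) for S, v in c.items())

def entropy(c):
    return sum(v * v * math.log2(1 / (v * v)) for v in c.values() if v)

f = lambda x: x[0]                             # dictator: f(x) = x_1
# compose with the linear map b_1 -> b_1 + b_2 + b_3 (mod 2),
# i.e. x_1 -> x_1 * x_2 * x_3 in the {-1,1} encoding:
g = lambda x: f((x[0] * x[1] * x[2], x[1], x[2]))

cf, cg = walsh_coeffs(f, 3), walsh_coeffs(g, 3)
print(entropy(cf), entropy(cg))      # prints: 0.0 0.0
print(influence(cf), influence(cg))  # prints: 1.0 3.0
```

Both spectra consist of a single coefficient, so both entropies are $0$, but the linear change of variables moves the dictator's influence from $1$ to the parity's $3$.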

C) My paper with Jean Bourgain, Influences of variables and threshold intervals under group symmetries, describes lower bounds on total influences for Boolean functions invariant under certain groups of permutations (of the variables). Those results are stronger (but more restrictive) than what the Entropy/Influence conjecture directly implies. (This was overlooked for a while.) The new paper gives (much hoped for, see below) simpler proofs and stronger results compared to those in my paper with Jean Bourgain.

D) Ryan O’Donnell wrote about Bourgain and some of his contributions to the analysis of Boolean functions:

“I spent close to 5 years understanding one 6-page paper of his (“On the distribution of the Fourier spectrum of Boolean functions”), and also close to 10 years understanding a 10-page paper of his (the k-SAT sharp threshold ‘appendix’). If anyone’s up for a challenge, I’m pretty sure no one on earth fully understands the second half of his paper “Influences of variables and threshold intervals under group symmetries” with Kalai (including Gil 🙂 )”

It looks as though by now we have a pretty good understanding, and even some far-reaching progress, regarding all three papers that Ryan mentioned. (It is unclear if we can now hope for an exponential spread of understanding of Bourgain’s proofs.)

This entry was posted in Combinatorics, Computer Science and Optimization, Open problems.

### 1 Response to Kelman, Kindler, Lifshitz, Minzer, and Safra: Towards the Entropy-Influence Conjecture

1. Gil Kalai says:

Here is a link to two videotaped lectures on the new proof by Dor Minzer https://simons.berkeley.edu/events/boolean-1