Bounded Independence, AC0, and Random 3SAT

As reported here, here and here, Mark Braverman has just announced a proof of a 1990 conjecture by Linial and Nisan.

Mark proves that if {C} is an AC0 boolean circuit (with NOT gates and with AND gates and OR gates of unbounded fan-in) of depth {d} and size {S}, and if {X} is any {k}-wise independent distribution with {k = (\log S)^{O(d^2)}}, then

\displaystyle  {\mathbb E} C(U_n) \approx {\mathbb E} C(X)

that is, {X} “fools” the circuit {C} into thinking that {X} is the uniform distribution {U_n} over {\{0,1\}^n}. Plausibly, this might be true even for {k = O((\log S)^{d-1})}.
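
Here, as usual, a {k}-wise independent distribution over {\{0,1\}^n} is one in which every {k} coordinates are jointly distributed like {k} independent unbiased bits, and the approximation in the conclusion should be read as saying that

\displaystyle  | {\mathbb E} C(U_n) - {\mathbb E} C(X) | \leq \epsilon

for a small error {\epsilon} whose exact dependence on {S}, {d} and {k} I am not going to keep track of.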

Nothing was known for depth 3 or more, and the depth-2 case was settled only recently by Bazzi, with a proof that, as you may remember, was significantly simplified by Razborov about six months ago.

Mark’s proof relies on approximating {C} via low-degree polynomials. The point is that if {p} is an {n}-variate (real valued) polynomial of degree {\leq k}, and {X} is a {k}-wise independent distribution ranging over {\{0,1\}^n}, then

\displaystyle  {\mathbb E} p(X) = {\mathbb E} p(U_n)
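
This is just linearity of expectation: restricted to {\{0,1\}^n}, we can write {p} as a linear combination of monomials {\prod_{i\in T} x_i} with {|T| \leq k}, and the expectation of each such monomial depends only on the joint distribution of the at most {k} coordinates in {T}, which is the same under {X} as under {U_n}. That is, if the {c_T} are the coefficients of {p},

\displaystyle  {\mathbb E}\, p(X) = \sum_{|T| \leq k} c_T \cdot {\mathbb E} \prod_{i \in T} X_i = \sum_{|T| \leq k} c_T \cdot {\mathbb E} \prod_{i \in T} (U_n)_i = {\mathbb E}\, p(U_n)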

Now if we could show that {p} approximates {C} both under {X} and under {U_n}, in the sense that {{\mathbb E} p(X) \approx {\mathbb E} C(X)}, and also {{\mathbb E} p(U_n) \approx {\mathbb E} C(U_n)}, then we would be done.

The Razborov-Smolensky lower bound technique gives a probabilistic construction of a polynomial {p} such that, for every fixed input {x}, one has {p(x) = C(x)} with high probability over the choice of {p}. In particular, one gets a single polynomial {p} such that both

\displaystyle   {\mathbb P}_{x\sim X} [p(x) = C(x) ] = 1-o(1) \ \ \ \ \ (1)

and

\displaystyle   {\mathbb P}_{x\sim U_n} [p(x) = C(x) ] = 1-o(1) \ \ \ \ \ (2)

Unfortunately this is not sufficient, because the polynomial {p} might be very large at a few points, and so even if {p} agrees with {C} with high probability there is no guarantee that the average of {p} is close to the average of {C}.
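
To get a feeling for both the construction and the problem, here is the standard probabilistic polynomial for a single OR gate on {n} bits (only meant as an illustration, and not necessarily the exact construction used in the paper): pick random subsets {S_1,\ldots,S_t \subseteq \{1,\ldots,n\}}, say a few independent samples at each density {2^{-\ell}} for {\ell = 0,\ldots,\log n}, and define

\displaystyle  p(x) := 1 - \prod_{i=1}^{t} \Big( 1 - \sum_{j \in S_i} x_j \Big)

If {x} is the all-zero input then {p(x) = 0 = OR(x)} with certainty; if {x} is nonzero, then with high probability over the choice of the subsets some {S_i} contains exactly one of the ones of {x}, the corresponding factor vanishes, and {p(x) = 1 = OR(x)}. The degree is only {t}, but on the inputs where the construction fails the factors {1 - \sum_{j \in S_i} x_j} can be negative numbers of large absolute value, and their product can be enormous, which is exactly the problem just described.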

Using a result of Linial, Mansour and Nisan (developed in the context of learning theory), one can construct a different kind of low-degree approximating polynomial {p}, which is such that

\displaystyle   {\mathbb E}_{x\sim U_n} | C(x) - p(x)|^2 = o(1) \ \ \ \ \ (3)
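
Concretely, and at the risk of oversimplifying, the Linial-Mansour-Nisan theorem says that the Fourier spectrum of a small AC0 circuit is concentrated on the low-degree coefficients, so one can take {p} to be the truncation of the Fourier expansion of {C} at degree {k}:

\displaystyle  p(x) := \sum_{|T| \leq k} \hat{C}(T)\, \chi_T(x) \ , \qquad {\mathbb E}_{x\sim U_n} |C(x) - p(x)|^2 = \sum_{|T| > k} \hat{C}(T)^2

where the {\chi_T} are the characters of {\{0,1\}^n} and the {\hat{C}(T)} are the Fourier coefficients of {C}. By Parseval, the error in (3) is exactly the Fourier tail of {C} above degree {k}, which Linial, Mansour and Nisan show to be {o(1)} once {k} is a suitable power of {\log S}.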

The Linial-Mansour-Nisan approximation, however, says nothing about the relation between {p} and {C} under the distribution {X}.

Using ideas of Bazzi’s, however, if we had a single polynomial {p} satisfying properties (1), (2) and (3) simultaneously, then we could construct another low-degree polynomial {p'} such that {{\mathbb E} p'(X) \approx {\mathbb E} C(X)} and {{\mathbb E} p'(U_n) \approx {\mathbb E} C(U_n)}, giving us that {C} is fooled by {X}.
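
The relevant tool here, as far as I understand it, is Bazzi’s “sandwiching” criterion: to show that a {k}-wise independent {X} fools {C} it is enough to exhibit two polynomials {p_\ell , p_u} of degree at most {k} such that

\displaystyle  p_\ell (x) \leq C(x) \leq p_u(x) \ \mbox{ for every } x \ , \qquad {\mathbb E}_{U_n} [\, p_u - p_\ell \,] \leq \epsilon

Indeed, we then have {{\mathbb E} C(X) \leq {\mathbb E} p_u(X) = {\mathbb E} p_u(U_n) \leq {\mathbb E} C(U_n) + \epsilon}, and the matching lower bound follows in the same way from {p_\ell}.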

As far as I understand, Mark constructs a polynomial satisfying properties (1), (2) and (3) by starting from the Razborov-Smolensky polynomial {p}, and then observing that the indicator function {E} of the points on which {p \neq C} is itself a boolean function admitting a Linial-Mansour-Nisan approximation {e}. Defining {p' := p\cdot (1-e)}, we have that {p'} has all the required properties, because multiplying by {1-e} “zeroes out” the points on which {p} is excessively large.
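
A back-of-the-envelope way to see why multiplying by {1-e} helps is the identity

\displaystyle  C - p\cdot(1-e) = (C-p)\cdot(1-e) + C\cdot e

On the points where {p = C} the first term vanishes and the second is small on average, because {e} is a good {L_2} approximation of the indicator {E} of a set of small measure; on the points where {p \neq C}, the factor {1-e} is close to zero and tames the possibly huge values of {C - p}, while {C \cdot e} stays bounded, and such points are rare anyway. (This is only a heuristic, of course, and making it work simultaneously under {U_n} and under {X} is presumably where the actual work in the proof is.)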

I have been interested in this problem for some time because of a connection with the complexity of 3SAT on random instances.
