I was at a conference last week, at which I heard two pieces of mathematical gossip.

One was that Arora, Barak and Steurer have developed an algorithm that, given a Unique Games instance in which a $1-\epsilon$ fraction of constraints are satisfiable, finds an assignment satisfying a constant fraction of constraints in time $2^{n^{\mathrm{poly}(\epsilon)}}$. This is now officially announced in an earlier (public) paper by Arora, Impagliazzo, Matthews and Steurer, which presented a slower algorithm running in time $2^{\alpha n}$, where $\alpha = \alpha(\epsilon)$ is a constant that tends to zero with $\epsilon$.
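To see concretely how much smaller such a running time is than brute force, here is a toy computation (my own illustration; the constant $c = 0.125$ stands in for $\mathrm{poly}(\epsilon)$ and is purely hypothetical):

```python
# Toy comparison (my own illustration): the exponent of 2 in the running time,
# for brute force 2^n versus a subexponential 2^(n^c). The constant c = 0.125
# is hypothetical, standing in for poly(eps).

def brute_force_exponent(n):
    # a 2^n-time algorithm: the exponent is n itself
    return n

def subexponential_exponent(n, c=0.125):
    # a 2^(n^c)-time algorithm: the exponent is n^c, vastly smaller
    return n ** c

for n in (10**3, 10**6, 10**9):
    print(n, brute_force_exponent(n), subexponential_exponent(n))
```

Even at $n = 10^9$ the exponent of the subexponential running time is a small two-digit number, while brute force has exponent $10^9$.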

I suppose that the next targets are now the approximation problems for which the only known hardness is via unique games. Is there a subexponential algorithm achieving a constant-factor approximation for Sparsest Cut, or a $(2-\delta)$-approximation for Vertex Cover?

The other news is on the Polynomial Ruzsa-Freiman conjecture, one of the main open problems in additive combinatorics. Apologies in advance to readers if I get some details wrong. In the special case of $\mathbb{F}_2$, the conjecture is that if $F: \mathbb{F}_2^n \to \mathbb{F}_2^m$ is any function such that

$\displaystyle \Pr_{x,y} [ F(x+y) = F(x) + F(y) ] \geq \epsilon$

then there is a matrix $M$ and a vector $b$ such that

$\displaystyle \Pr_x [ F(x) = Mx + b ] \geq \delta$

where the probability $\delta$ in the conclusion is independent of $n$ and is polynomial in $\epsilon$. Various proofs were known achieving a bound of the form $\delta = \exp(-\mathrm{poly}(1/\epsilon))$. The first proof, due to Samorodnitsky, achieves, I believe, a bound of this exponential type, while the results from this paper should imply a bound of $\delta = \exp(-\tilde{O}(1/\epsilon))$, where we use the notation $\tilde{O}(t) := t \cdot \mathrm{poly}(\log t)$.
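To make the hypothesis concrete, here is a toy sanity check (my own example, not from any of the papers mentioned) that computes $\Pr_{x,y}[F(x+y)=F(x)+F(y)]$ by exhaustive enumeration over $\mathbb{F}_2^4$, for an exactly linear map and for a slightly corrupted one:

```python
from itertools import product

n = 4  # dimension of the toy group F_2^n

def add(x, y):
    # addition in F_2^n is coordinate-wise XOR
    return tuple(a ^ b for a, b in zip(x, y))

def linear_F(x):
    # an exactly linear map: F(x) = Mx with M the identity matrix
    return x

def corrupted_F(x):
    # the same map, but with the output deliberately flipped on two inputs,
    # so that it is only approximately linear
    bad = {(1, 1, 1, 1), (1, 0, 1, 0)}
    return add(x, (1, 0, 0, 0)) if x in bad else x

def homomorphism_prob(F):
    # probability over all pairs (x, y) that F(x+y) = F(x) + F(y)
    pts = list(product((0, 1), repeat=n))
    good = sum(F(add(x, y)) == add(F(x), F(y)) for x in pts for y in pts)
    return good / len(pts) ** 2

print(homomorphism_prob(linear_F))     # 1.0: a linear map always passes
print(homomorphism_prob(corrupted_F))  # strictly between 0.5 and 1
```

Corrupting only two of the sixteen inputs already drops the agreement probability noticeably, since each bad input spoils many pairs $(x,y)$.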

At the conference, Ben Green announced a result of Schoen implying a subexponential bound of $\delta = \exp(-(1/\epsilon)^{o(1)})$.

The proof begins with the standard step of applying a theorem of Ruzsa to find a subset $A \subseteq \mathbb{F}_2^n$ such that $|A| \geq \mathrm{poly}(\epsilon) \cdot 2^n$, and such that $F$ on $A$ is a “Freiman 16-homomorphism,” meaning that for every 32 elements $a_1,\ldots,a_{16},a'_1,\ldots,a'_{16}$ of $A$ such that

$\displaystyle a_1 + \cdots + a_{16} = a'_1 + \cdots + a'_{16}$

we have

$\displaystyle F(a_1) + \cdots + F(a_{16}) = F(a'_1) + \cdots + F(a'_{16})$
The choice of 16 is just whatever makes the rest of the proof work; the theorem of Ruzsa works for any arbitrarily large constant. Then we consider the set $B$ of all the elements that can be written as $a_1+a_2+a_3+a_4$ with $a_i \in A$, and we define a function $F'$ on $B$ by setting

$\displaystyle F'(a_1+a_2+a_3+a_4) := F(a_1) + F(a_2) + F(a_3) + F(a_4)$

which is a well-posed definition because $F$ is a Freiman 16-homomorphism on $A$. (It would have been sufficient if $F$ had been an 8-homomorphism.) Note that for every $x,y,z,w \in B$ such that $x+y = z+w$ we have

$\displaystyle F'(x) + F'(y) = F'(z) + F'(w)$
Now the result of Schoen implies that if $A$ is a subset of $\mathbb{F}_2^n$ of size $\delta \cdot 2^n$, then there is a subspace $V$ of dimension at least $n - (1/\delta)^{o(1)}$ such that $V$ is entirely contained in $B = A+A+A+A$.

Previously, it was known how to get a subspace of dimension $n - \mathrm{poly}(1/\delta)$, by adapting a technique of Chang (see this survey paper).

Note that $F'$ is now a linear map on $V$, and that it agrees with $F$ on $A$. (I cannot reconstruct how this last step follows, however; maybe the claim is that $F'$ agrees with $F$ on a constant fraction of the elements of $A$?) It now just remains to extend $F'$, arbitrarily, to a linear map over the entire space.
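For completeness, here is how the linearity of $F'$ on $V$ can be checked (my own reconstruction, assuming as above that $V$ is a subspace contained in $B$):

```latex
% For x, y in V the four points x, y, x+y, 0 all lie in B (V is a subspace
% contained in B), and they satisfy x + y = (x+y) + 0. By the property of F'
% noted above,
\[
F'(x) + F'(y) \;=\; F'(x+y) + F'(0) .
\]
% Therefore the shifted function G(x) := F'(x) + F'(0) satisfies, in
% characteristic 2,
\[
G(x) + G(y) \;=\; F'(x) + F'(y) \;=\; F'(x+y) + F'(0) \;=\; G(x+y) ,
\]
% so G is linear on V, and F' itself is affine: F'(x) = G(x) + F'(0).
```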

Can you please expound upon why the 2^{n^{eps}} algorithm of ABS is a breakthrough?

Are there no other NP-hard problems that have an algorithm with such a running time?

Does this mean that distinguishing between 1-eps and eps instances of UG cannot be NP-hard?

Thanks!!

If one NP-complete problem had algorithms of running time $2^{n^{o(1)}}$, then all problems in NP would have such algorithms, and this seems very unlikely. Indeed, it seems entirely plausible that 3SAT does not have algorithms running in time $2^{o(n)}$.

It is possible that the Unique Games conjecture, as originally stated, is true, but now it seems impossible that we could have a polynomial-time reduction establishing NP-hardness of unique games with sub-constant $\epsilon$. This is notable because such reductions exist for 2-query PCPs (i.e. Label Cover), so a hypothetical proof of the Unique Games conjecture would have to be fundamentally different from a 2-query PCP construction.

“If one NP-complete problem had algorithms of running time 2^{n^{o(1)}}, then all problems in NP would have such algorithms,”

What is the name of this theorem? Simple NP-hardness reductions do not seem to imply this, since the size of “n” (e.g. number of vertices) may be increased or decreased polynomially in a reduction. Thus, a 2^{n^{o(1)}} algorithm for problem A wouldn’t directly imply one for problem B via a polytime reduction, if problem A has, say, n^2 vertices after a reduction from problem B to problem A.

$2^{(n^2)^{o(1)}}$ is the same as $2^{n^{o(1)}}$.
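Spelling out why polynomial blowup preserves this running-time class (my own expansion of the point):

```latex
% Suppose problem B reduces in polynomial time to problem A, mapping instances
% of size n to instances of size N <= n^k for some constant k, and suppose A
% is solvable in time 2^{N^{f(N)}} with f(N) -> 0 as N -> infinity. Then B is
% solvable in time
\[
\mathrm{poly}(n) + 2^{(n^k)^{f(n^k)}} \;=\; \mathrm{poly}(n) + 2^{n^{k \cdot f(n^k)}} ,
\]
% and since k \cdot f(n^k) -> 0 as n -> infinity, this is still time
% 2^{n^{o(1)}}.
```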

“… and this seems very unlikely”

Why do you say this? For all we know, couldn’t they all have n^{O(1)}-time algorithms?

Believer, they might have, but that seems even more unlikely.

Let’s not mix mathematics and religion.

If eps is a constant such that 0 < eps < 1, then 2^{n^{poly(eps)}} is not 2^{n^{o(1)}}, since poly(eps) = Theta(1).

So could there still be some constant eps such that it is "not unlikely" that UG is NP-hard to distinguish between cases in which a 1-eps fraction of constraints are satisfiable and cases in which only an eps fraction are?

(Sorry, couldn’t get the formulas in my previous post to “parse”.)

It is still possible, as I said in the second comment, that the original statement of the Unique Games conjecture is true, that is, that for every constant $\epsilon > 0$ there is a polynomial time reduction from an NP-complete problem, say, 3SAT, to unique games producing a gap of $1-\epsilon$ versus $\epsilon$. The reduction, however, when it starts from instances of 3SAT of size $n$, must produce instances of unique games of size at least $n^{1/\mathrm{poly}(\epsilon)}$, or else we would have subexponential algorithms for all of NP.
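To expand the arithmetic behind this (my reconstruction, writing $c(\epsilon) = \mathrm{poly}(\epsilon)$ for the exponent in the subexponential algorithm):

```latex
% Composing a hypothetical reduction with the subexponential algorithm: a
% polynomial-time reduction maps 3SAT instances of size n to unique games
% instances of size N with a gap of 1-\epsilon versus \epsilon, and the
% algorithm handles such instances in time 2^{N^{c(\epsilon)}}. The
% combination solves 3SAT in time about
\[
\mathrm{poly}(n) + 2^{N^{c(\epsilon)}} .
\]
% If 3SAT requires time 2^{\Omega(n)}, then we must have
% N^{c(\epsilon)} = \Omega(n), that is,
\[
N \;\ge\; n^{\Omega(1/c(\epsilon))} ,
\]
% so the degree of the polynomial blowup in the reduction must grow as
% \epsilon gets smaller.
```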

What now seems unlikely is a polynomial time reduction from an NP-complete problem to Unique Games producing a gap of $1-\epsilon(n)$ versus $\epsilon(n)$ for some $\epsilon(n) = o(1)$, the sort of stronger assumption needed to prove superconstant inapproximability results for sparsest cut and other problems.

Usually, in PCP constructions, if one can achieve arbitrarily small constant soundness with polynomial time reductions, then one can also obtain sub-constant soundness in polynomial time with PCPs of the same kind. If the Unique Games conjecture is true and NP does not have sub-exponential algorithms, then this step from arbitrarily small constant soundness to soundness $o(1)$ is impossible for unique games.

Link to the announced paper: http://www.cs.princeton.edu/~dsteurer/subexpug.pdf.