It has been too long and now there is no point telling stories from the last two days of FOCS, such as the distinguished theoretician who “flew” on the zip-line on Fremont street and then got drunk at Double Down and asked the two big scary guys who were playing pool if they were twins (they said they were).

As soon as the videos of the conference go online, however, I suggest that everybody who wasn’t there watch Dan Spielman’s invited talk, which was phenomenal. Dan talked about the problem of solving linear equations when the constraint matrix is the Laplacian of a graph, the result of a long-term collaboration with Shang-Hua Teng. Two major parts of this research program have gone on to become research areas of their own:

- finding sparse cuts in graphs in time that depends only on the size of the smaller side of the cut, a problem that Andersen, Chung and Lang solve with a modified PageRank computation, and that Andersen and Peres solve via an algorithm that simulates a random walk over sets (the “evolving set” process). It remains open to find a local algorithm with the same performance as spectral partitioning, dependent only on the conductance of the graph and not on the number of vertices;
- finding ‘spectral sparsifiers,’ a stronger object than the cut sparsifiers studied by Benczur and Karger. There are several constructions of such objects, with trade-offs between efficiency of computation and sparsity of the resulting graphs. At one extreme, the quite amazing algorithm of Batson, Spielman, and Srivastava produces graphs of constant average degree and, when applied to sparsify a clique, produces expanders with only twice as many edges as a Ramanujan (optimal) expander with the same expansion.
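The local-clustering idea in the first bullet can be sketched with a push-style approximate personalized PageRank computation, in the spirit of the Andersen–Chung–Lang approach (function and parameter names here are illustrative, not the authors’ code):

```python
from collections import defaultdict

def approximate_pagerank(adj, seed, alpha=0.15, eps=1e-4):
    """Push-style approximate personalized PageRank (sketch).

    adj: dict mapping each node to its list of neighbors (undirected graph).
    Returns a sparse dict p; the residual invariant guarantees
    p is within eps * deg(u) of the true PageRank at every node u.
    """
    p = defaultdict(float)   # current approximation
    r = defaultdict(float)   # residual probability mass not yet pushed
    r[seed] = 1.0
    deg = {u: len(vs) for u, vs in adj.items()}
    queue = [seed]
    while queue:
        u = queue.pop()
        if r[u] < eps * deg[u]:      # stale entry, nothing left to push
            continue
        ru = r[u]
        p[u] += alpha * ru           # keep an alpha fraction at u
        r[u] = (1 - alpha) * ru / 2  # lazy walk: half the rest stays
        for v in adj[u]:             # the other half spreads to neighbors
            r[v] += (1 - alpha) * ru / (2 * deg[u])
            if r[v] >= eps * deg[v]:
                queue.append(v)
        if r[u] >= eps * deg[u]:
            queue.append(u)
    return dict(p)
```

Each push moves at least `alpha * eps` mass from the residual to `p`, so the total work is bounded by `1/(alpha * eps)` independently of the graph size — which is the “local” property in the bullet above. A sweep over vertices sorted by `p(u)/deg(u)` then yields a candidate low-conductance cut around the seed.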

On Sunday I also attended a cryptography session. One thing that impressed me was the lively discussion at the end of the talks, very different from the stunned silence that usually follows when the session chair asks if there are any questions. The other thing I noticed was that the session was attended almost exclusively by cryptographers.

Why is that? A first guess is that the field has become very technical. But this cannot be the whole explanation; after all, a typical paper on PCP is also very technical, yet the audience is not made exclusively of PCP technicians. Maybe the point is that even, or especially, *definitions* are very technical in cryptography. One can go to a talk showing that sparsest cut has no constant-factor approximation assuming the Unique Games Conjecture, and be fairly satisfied that one understands what it would mean for sparsest cut to have a constant-factor approximation and what it would mean for the Unique Games Conjecture to be false. Then one sees some slides with clouds of vertices connected in various ways, hears mentions of Gaussian distributions, influence of variables, and invariance principles, and gets lost, but with the idea that there is a reduction whose analysis requires certain complicated mathematical techniques.

In a cryptography talk, however, one may get started with the problem of realizing primitive X under assumptions Y1 and Y2, according to security requirement Z, with no set-up assumptions, and it would require quite some expertise to realize that requirement Z is considerably harder to achieve than similarly sounding Z’, which was known to be achievable under assumptions of Y1 and Y’2, where Y’2 is incomparable to Y2, but intuitively stronger, and so on. Consider the recent breakthrough on the long-standing, very clear-cut question of achieving statistically hiding commitments assuming only one-way functions. This is a statement an order of magnitude simpler than the typical result in cryptography, probably the most basic question that was still open in the 2000s, but even unpacking such a statement is not easy, and requires seeing various examples, discussing applications, and so on.

Maybe another point is *familiarity*. I remember I used to stop listening when people would say “quantum,” while with time I have become familiar with the basic definitions and I don’t mind hearing about what’s new and exciting in quantum computing. I also used to tune out of game-theory talks, but I have finally got the notion of truthiness, or whatever it is called, of an auction, the notion of equilibrium, and so on. (I still stop listening, however, when people say ‘matroid.’)

Could it be that people tune out of (or don’t go to) crypto talks because they are not familiar with the basic definitions? The last time I taught CS170, the undergraduate algorithms class at Berkeley, there was to be a lecture on RSA. For most students it was the second or third time they were going through it, so the class looked rather bored when I drew Alice, Bob and Eve, described the setting of public-key encryption, defined RSA and proved that it is a permutation. Then I cautioned them not to think of RSA as an encryption scheme, but just as a primitive from which to construct one: when you apply it to chunks of a message written in English you are not applying it to a random number, so it might be easy to invert; if you apply it twice to the same data you get the same output; if you apply it to a message coming from a known small set, the adversary can figure the message out; and so on. In particular, the problem of detecting when the same message is sent twice suggests that encryption must be a randomized process. This was all evidently new to them. On the way back, the colleague who was co-teaching the class with me, and who had sat in on the lecture, told me that those were interesting examples, and wondered whether there is research in crypto on avoiding such problems. I think he was joking, but he said it with a very straight face.
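The pitfalls from that lecture are easy to demonstrate concretely. Here is a toy numeric illustration using the familiar textbook primes 61 and 53 (deliberately insecure, purely for illustration):

```python
# Textbook ("raw") RSA with toy primes -- deliberately insecure, for illustration.
p, q = 61, 53
n = p * q                  # modulus: 3233
phi = (p - 1) * (q - 1)    # 3120
e = 17                     # public exponent, coprime to phi
d = pow(e, -1, phi)        # private exponent via modular inverse (Python 3.8+)

def rsa(x):
    """The RSA permutation x -> x^e mod n: a trapdoor permutation,
    not by itself an encryption scheme."""
    return pow(x, e, n)

def rsa_inverse(y):
    """The trapdoor direction y -> y^d mod n."""
    return pow(y, d, n)

m = 42
assert rsa_inverse(rsa(m)) == m   # it is indeed a permutation

# Pitfall 1: determinism. The same message always yields the same
# ciphertext, so an eavesdropper can detect repeated messages.
assert rsa(m) == rsa(m)

# Pitfall 2: small message space. If Eve knows the message is 0 or 1,
# she encrypts both candidates herself and compares.
table = {rsa(x): x for x in (0, 1)}
intercepted = rsa(1)
assert table[intercepted] == 1    # Eve recovers the plaintext
```

Randomized constructions built on top of the RSA primitive (RSA-OAEP, for instance) inject fresh randomness into every encryption precisely to defeat these attacks — which is the point the lecture was making.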

truthiness?! Sounds like you’re learning your game theory from Stephen Colbert 😀

If you look at the past 10+ years, for the most part the best papers in cryptography have gone to Crypto/Eurocrypt. Off the top of my head, I’m thinking of papers like Cramer-Shoup, the MD5 collision attack, and Boneh-Franklin’s IBE paper. (One can split hairs and try to talk about “theoretical crypto papers”, but what is not theoretical about the Cramer-Shoup or Boneh-Franklin papers?) Indeed, I think one of the problems with crypto and STOC/FOCS is that papers such as those would have a hard time getting in.

As someone who has some idea of the basic things in complexity, one problem is that cryptographers use words (e.g. block ciphers, MAC) which have equivalents in complexity, but the translation can be hard to do in real time. Second, theoretical crypto constructions can fail because of various subtleties, and while listening to a talk it can be hard to work out the minimum assumption required for a given task.

>In a cryptography talk, however, one may get started with the problem of realizing primitive X under assumptions Y1 and Y2, according to security requirement Z, with no set-up assumptions, and it would require quite some expertise to realize that requirement Z is considerably harder to achieve than similarly sounding Z’, which was known to be achievable under assumptions of Y1 and Y’2, where Y’2 is incomparable to Y2, but intuitively stronger, and so on


Man, this piece is so hilarious… ha ha ha.