You are currently browsing the monthly archive for March 2010.
Contrary to common belief, Beijing does have gorgeous days of clear and blue skies; the mountains in the above picture are probably 50 miles away or more, and rather clearly visible. This is, however, a couple of days after a sandstorm.
On this trip, it was decided that I needed a Tsinghua university id card and so one dusty morning I went off with a secretary to the university HR offices. After walking about a mile (the campus is rather big), we got to the oldest and prettiest part of campus, the one where one expects administrative buildings to be. We walk into an office, where the secretary has a short discussion with the person in charge. Then we move to the office next door, where an administrator prints some documents and stamps them with official red-ink stamps. We take the stamped documents, and go three doors down, where we drop off one of them. Finally, we go to another building, where the id-making guy sits.
(This sounds a bit convoluted, compared to the American approach of going directly to the id-making person and showing him/her a driver’s license, but it is positively streamlined compared to getting anything done at the University of Rome. Incidentally, at no point was I asked to prove my identity, which seems the one important step in the process.)
The id-making guy’s office has a stool facing a professional camera and lighting, the machine to produce the id, and… a desk full of glasses. I would guess the glasses are fake (that is, have no corrective lenses) and are meant as props for taking pictures, but I couldn’t understand why. Was the idea that someone who does not wear glasses would take his id picture with glasses? So that he doesn’t look too much like his id? Then why not have fake mustaches? Or maybe it’s for people who don’t like their own glasses? But it’s not like the ones on the desk were particularly fashionable.
I elect to take my picture with naked eyes, but a new problem develops. Like the superstitious sports fan who always watches games wearing the same clothes he wore the last time his team won a big game, I was wearing the same shirt I wore the last time I had a good picture taken of me, which is the one on my home page. (For context, previously I had on my home page a picture taken at my first communion party, which was the previous picture to come out good.) Apparently, however, that shirt is “white,” and this is no good because it would reflect too much light on my face.
So I take the picture with my overcoat on, covering the offending shirt. “Good,” I am told, after taking the picture, “your eyes are open.” Apparently the previous foreigner to have his picture taken had closed his eyes when the flash came on, on three consecutive attempts. The bar for acceptable foreigners’ pictures having been so lowered by my colleague, the id is printed, and the secretary keeps it “for safety.”
On my last night in Beijing I experienced firsthand a startling Chinese driving convention. As we are stuck in traffic in a taxi, another car rear-ends us quite hard. The driver gets out, the occupants of the other car get out, and haggling begins. In this kind of situation, that is, the offending party pays some money, negotiated on the spot, to the offended party, so that everybody can drive off quicker, and not bother the police and the insurance company. The other people start at $3, and after much haggling, with the meter running, they settle for $7. You have to love a country where you can bump into another car and drive away like that. After we get to our destination, we haggle ourselves for the part of the fare wasted in waiting. We get $.50 off.
I am leaving today for Beijing, where I am once again going to spend Spring Break (as I did in 2006, the occasion that led to starting in theory, and in 2008). On Friday I will lecture undergraduates on approximation algorithms, and next Thursday I’ll speak on time-space tradeoffs for attacks against one-way functions and pseudorandom generators. Unfortunately, I will miss by a couple of days the festivities for Serf Emancipation Day, the holiday that celebrates the emancipation of Tibetan peasants from the tyranny of the Dalai Lama (no, I am not making it up; in fairness I am coming from a country that is slowly catching up on rewriting history, and we’ll see in twenty years who is going to be ahead).
Posting here might be difficult while I am away.
At the conference I was at last week, I was asked by a mathematician about the exact statement of the Impagliazzo Hard-Core Lemma, and whether I could state it without saying “algorithm.”
So I gave the following statement:
Theorem 1 (Impagliazzo Hard-Core Lemma — Basic Version) Let $F$ be a class of boolean functions $f: \{0,1\}^n \rightarrow \{-1,1\}$, let $g: \{0,1\}^n \rightarrow \{-1,1\}$ be an arbitrary function, and let $\epsilon, \delta > 0$ be arbitrary parameters.
Then at least one of the following conditions is true:
- $g$ is globally well approximated by a simple composition of a few functions from $F$.
That is, there are functions $f_1, \ldots, f_k \in F$, $k = \mathrm{poly}(1/\epsilon, 1/\delta)$, and a “simple” function $h: \{-1,1\}^k \rightarrow \{-1,1\}$ such that
$\Pr_x [ g(x) = h(f_1(x), \ldots, f_k(x)) ] \geq 1 - \delta$
- $g$ is locally almost orthogonal to $F$.
That is, there is a subset $H \subseteq \{0,1\}^n$ of density at least $\delta$ such that for every $f \in F$,
$\left| \mathop{\bf E}_{x \in H} [ f(x) \cdot g(x) ] \right| \leq \epsilon$
The theorem has been strengthened in a couple of ways in work of Klivans and Servedio and of Holenstein. The “complexity” parameter $k$ needs to be only $O(\epsilon^{-2} \cdot \log(1/\delta))$, which is conjectured (or known?) to be tight, and the density of the set $H$ can be made $2\delta$, which is tight. In all proofs $h$ is really simple, being more or less a threshold function applied to the sum of the $f_i$.
Stating the result in this way (which is, of course, standard) raises a few questions that I find rather interesting.
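To make the first case concrete, here is a toy sanity check (my own illustration, not taken from any of the proofs): with $h$ a majority/threshold vote, a few functions that each agree with $g$ almost everywhere can combine into an approximator that is better than any single one of them.

```python
def majority_approximator(fs):
    """Combine f_1,...,f_k: {0,1}^n -> {-1,1} into h(f_1,...,f_k),
    where h is the threshold (majority) function: the sign of the sum."""
    def h(x):
        return 1 if sum(f(x) for f in fs) >= 0 else -1
    return h

# Toy example over n = 3 bits: g is the parity function (as a {-1,1} value),
# and each f_i is a copy of g corrupted on one distinct point.
n = 3
points = [tuple((i >> j) & 1 for j in range(n)) for i in range(2 ** n)]
g = lambda x: 1 if sum(x) % 2 == 0 else -1

def noisy_copy(bad_point):
    return lambda x: -g(x) if x == bad_point else g(x)

fs = [noisy_copy(p) for p in points[:3]]  # three copies, each wrong once
h = majority_approximator(fs)

# The majority vote corrects each individual error, so h agrees with g
# on every point even though no single f_i does.
agreement = sum(1 for x in points if h(x) == g(x)) / len(points)
print(agreement)  # 1.0
```

Of course this says nothing about the proof of the lemma; it only illustrates the shape of the approximator in the first case.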
I was at a conference last week, at which I heard two pieces of mathematical gossip.
One was that Arora, Barak and Steurer have developed an algorithm that, given a Unique Games instance in which a $1-\epsilon$ fraction of constraints are satisfiable, finds an assignment satisfying a constant fraction of constraints in time $2^{n^{\mathrm{poly}(\epsilon)}}$. This is now officially announced in an earlier (public) paper by Arora, Impagliazzo, Matthews and Steurer, which presented a slower algorithm running in time $2^{\alpha n}$, where $\alpha = \alpha(\epsilon)$ tends to zero with $\epsilon$.
I suppose that the next targets are now the approximation problems for which the only known hardness is via unique games. Is there a subexponential algorithm achieving a constant-factor approximation for sparsest cut, or a $2-\epsilon$ approximation for Vertex Cover?
The other news is on the Polynomial Freiman-Ruzsa conjecture, one of the main open problems in additive combinatorics. Apologies in advance to readers if I get some details wrong. In the special case of $\mathbb{F}_2^n$, the conjecture is that if $f: \mathbb{F}_2^n \rightarrow \mathbb{F}_2^n$ is any function such that
$\Pr_{x,y} [ f(x+y) = f(x) + f(y) ] \geq \epsilon$
then there is a matrix $M$ and a vector $b$ such that
$\Pr_x [ f(x) = Mx + b ] \geq \epsilon'$
where the probability $\epsilon'$ in the conclusion is independent of $n$ and is polynomial in $\epsilon$. Various proofs were known, the first due to Samorodnitsky, achieving bounds of the form $\epsilon' = \exp(-\mathrm{poly}(1/\epsilon))$, with later results improving the polynomial in the exponent.
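As a quick sanity check on the statement (my own toy computation, not from the announcement), one can compute both probabilities exhaustively for small $n$, representing elements of $\mathbb{F}_2^4$ as bitmasks so that group addition is XOR:

```python
n = 4
points = list(range(2 ** n))  # elements of F_2^4 as bitmasks; addition is XOR

def additivity_probability(f):
    """Compute Pr_{x,y}[ f(x+y) = f(x) + f(y) ] exactly, over all pairs."""
    good = sum(1 for x in points for y in points if f(x ^ y) == f(x) ^ f(y))
    return good / len(points) ** 2

# A linear map f(x) = Mx -- here M is the cyclic rotation of the 4 bits --
# satisfies the hypothesis with probability 1.
linear = lambda x: ((x & 1) << 3) | (x >> 1)
print(additivity_probability(linear))  # 1.0

# Redefining f on two of the 16 points can only break pairs (x, y) in which
# x, y, or x+y is a perturbed point, so the probability stays above 1/2.
perturbed = lambda x: linear(x) ^ 0b1111 if x in (1, 2) else linear(x)
print(additivity_probability(perturbed) > 0.5)  # True
```

The interesting direction of the conjecture is of course the converse: from approximate additivity, recover exact agreement with an affine map on a noticeable fraction of points.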
At the conference, Ben Green announced a result of Schoen implying a subexponential bound of $\epsilon' = \exp(-(1/\epsilon)^{o(1)})$.
The proof begins with the standard step of applying a theorem of Ruzsa to find a subset $A \subseteq \mathbb{F}_2^n$ with $|A| \geq \mathrm{poly}(\epsilon) \cdot 2^n$ such that $f$ restricted to $A$ is a “Freiman 16-homomorphism,” meaning that for every 32 elements $a_1, \ldots, a_{32}$ of $A$ such that
$a_1 + \cdots + a_{16} = a_{17} + \cdots + a_{32}$
we have
$f(a_1) + \cdots + f(a_{16}) = f(a_{17}) + \cdots + f(a_{32})$
The choice of 16 is just whatever makes the rest of the proof work; the theorem of Ruzsa works for any arbitrarily large constant. Then we consider the set $8A$ of all the elements that can be written as $a_1 + \cdots + a_8$ with $a_i \in A$, and we define a function $g$ on $8A$ by setting
$g(a_1 + \cdots + a_8) := f(a_1) + \cdots + f(a_8)$
which is a well-posed definition because $f$ is a Freiman 16-homomorphism. (For well-posedness it would have been sufficient if $f$ had been an 8-homomorphism.) Note that for every $x, y \in 8A$ such that $x + y \in 8A$ we have
$g(x+y) = g(x) + g(y)$
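Here is a small sanity check of the well-posedness claim (my own toy computation; it uses sums of 2 elements in place of 8 to keep the search small): an affine map $f(x) = Mx + b$ is a Freiman homomorphism of every even order, since the $b$ terms cancel over $\mathbb{F}_2$, so the value assigned to a sum does not depend on the chosen representation.

```python
import itertools
from collections import defaultdict

# An affine map f(x) = Mx + b over F_2^4: M rotates the 4 bits, b = 0b1010.
M = lambda x: ((x & 1) << 3) | (x >> 1)
f = lambda x: M(x) ^ 0b1010

A = [0b0011, 0b0101, 0b0110, 0b1001, 0b1100]  # an arbitrary subset of F_2^4

# Analogue of the definition of g, with 2 in place of 8:
#   g(a1 + a2) = f(a1) + f(a2)   (addition is XOR)
# Well-posedness means every way of writing the same element of 2A
# yields the same value; we collect all candidate values per element.
values = defaultdict(set)
for a1, a2 in itertools.product(A, repeat=2):
    values[a1 ^ a2].add(f(a1) ^ f(a2))

well_posed = all(len(v) == 1 for v in values.values())
print(well_posed)  # True: each element of 2A gets a single value
```

Here $f(a_1) + f(a_2) = M(a_1 + a_2)$ regardless of the representation, which is exactly why the definition is consistent.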
Now the result of Schoen implies that if $A$ is a subset of $\mathbb{F}_2^n$ of size $|A| \geq \mathrm{poly}(\epsilon) \cdot 2^n$, then there is a subspace $V$ of dimension $n - (1/\epsilon)^{o(1)}$ such that $V$ is entirely contained in $8A$.
Previously, it was known how to get a subspace of dimension $n - \mathrm{poly}(1/\epsilon)$, by adapting a technique of Chang (see this survey paper).
Note that $g$ is now a linear map on $V$, and that it agrees with $f$ on $A \cap V$. (I cannot reconstruct how this last step follows, however; maybe the claim is that $g$ agrees with $f$ on a $\mathrm{poly}(\epsilon)$ fraction of points?) It now just remains to extend $g$, arbitrarily, to a linear map over the entire space.
In a tutorial on average-case complexity that I gave at FOCS 2008, I mentioned four of my favorite open questions in the theory. I have some progress to report on one of the questions (#3 in this post), and on two related problems.
The question is whether a certain exponential loss is necessary in Levin’s proof that there are problems that are “NP-complete on average.” The answer is yes, it is necessary, unless the polynomial hierarchy collapses. The two related problems are whether related exponential losses in the analysis of Levin’s universal search algorithm and of Levin’s universal one-way function are also necessary. The answers are, respectively, yes unless ${\bf P} = {\bf NP}$, and yes relative to an oracle.
This is one of those now fashionable papers whose arguments are “obvious in retrospect.” (I would like to claim, however, that the proof of the result for Levin’s average-case completeness, while obvious in hindsight, is not so obvious without hindsight.) In all three cases, the exponential loss is in terms of the code length of certain programs. The idea in each case is to construct programs that contain in their code information about a specific instance of a problem of interest.