The September issue of the Notices of the AMS is out, and it contains an article by Neal Koblitz on modern cryptography, revisiting themes he wrote about in previous articles.
Before I get to Koblitz’s thesis I should describe the context, as I see it.
Cryptography underwent two major revolutions in the 1970s and 1980s.
The notion of public key cryptography, invented by Diffie and Hellman (and earlier, but only in classified documents that didn’t enter the public domain for decades, by Ellis, Cocks, and Williamson) and made possible by Rivest, Shamir and Adleman, allowed parties that had never met in advance to share a secret key and communicate over an insecure channel. Without this technology, buying and selling things online would be extremely inconvenient, and companies like Amazon and eBay would probably not exist.
The other revolution, started by the 1982 series of papers by Blum, Goldwasser, Micali and Yao, was the discovery that one could provide formal definitions of security for cryptographic problems, and that such definitions were achievable under complexity assumptions, albeit, initially, via slow constructions.
Indeed, the importance of the new definitions cannot be overstated and, possibly for lack of accessible expositions, it has not been completely digested by the security community. I remember, not too long ago, reading a paper on electronic elections that listed seven or more requirements that an election protocol should satisfy, and it was clear that the list was unwieldy, redundant, and, most importantly, incomplete. The modern definitional approach is instead to begin with a description of an ideal setting (where every person votes in a secret ballot, then all ballots are counted and the total tally is the only information that anybody gets) and then to require that a protocol be such that anything an adversary can do in the protocol to alter the outcome or gain information can also be done in the ideal setting. In particular, whatever outcome or information an attacker cannot gain in the ideal setting cannot be gained in the protocol either.
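(To make the ideal-setting paradigm concrete, here is a minimal sketch, in Python, of the ideal election functionality just described; the names and the ballot representation are mine, purely for illustration.)

```python
from collections import Counter

# A minimal, illustrative sketch of the "ideal setting" for an election:
# a trusted party collects secret ballots and reveals only the total tally.
def ideal_election(ballots):
    return Counter(ballots)

# A protocol is then deemed secure if anything an adversary can do against
# the protocol could also be done against this trusted party: learn nothing
# beyond the tally, and influence the outcome only through the ballots it
# legitimately casts.
print(ideal_election(["yes", "no", "yes"]))  # Counter({'yes': 2, 'no': 1})
```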
Some constructions developed by theoreticians, and coming with a formal analysis, are too inefficient to be used, but their development often leads to the discovery of general design principles: for example, that public key encryption algorithms must be randomized, and should be designed so that it is not possible to construct a valid ciphertext in any way other than applying the encryption algorithm to a known message.
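(As a toy illustration of the first principle, with made-up, insecure parameters: if encryption is deterministic, as in “textbook” RSA, an eavesdropper can decrypt any message drawn from a small set simply by re-encrypting every candidate under the public key and comparing.)

```python
# Toy illustration (insecure toy parameters, for exposition only) of why
# public key encryption must be randomized: deterministic encryption lets
# anyone holding the public key re-encrypt candidate plaintexts and compare
# them against an intercepted ciphertext.
def textbook_rsa_encrypt(m, e, n):
    return pow(m, e, n)  # deterministic: the same m always yields the same ciphertext

e, n = 17, 3233  # toy public key (n = 61 * 53); real moduli are far larger
intercepted = textbook_rsa_encrypt(1, e, n)  # say, an encrypted yes/no vote

for guess in (0, 1):  # the entire message space
    if textbook_rsa_encrypt(guess, e, n) == intercepted:
        print("recovered plaintext:", guess)  # prints: recovered plaintext: 1
```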
Indeed, modern cryptography is the area of computer science where the distance between theory and practice is smallest: one finds theoreticians who spend most of their time on impractical constructions designed to be “plausibility results” but who also sit on standards bodies and help create and assess widely used systems, whereas one is less likely to see the algorist preoccupied with $O(\log n)$ approximation algorithms also working on commercial optimization packages. The important difference is that optimization algorithms can be validated in practice in a way that is impossible for cryptographic protocols, where one cannot anticipate what attacks will be used, and hence one should strive for security against all possible attacks, which is possible, within an attack model, only via formal analysis and reductions.
Koblitz points out that sometimes proofs contain mistakes, and that there can be attacks not covered by standard models. His reaction, however, is not that the community should be very careful about formal correctness, and explore (as is being done) new models that take into account timing attacks and other “grey box” accesses to the computations of the parties. Rather, he suggests doing away with proofs, and relying more on intuition. This is discussed in the second part of the paper (the first part is devoted to encryption schemes and factoring algorithms via elliptic curves, the “good” interaction between math and cryptography), through such rhetorical devices as non sequiturs, personal attacks, and petulance. The CRYPTO community’s typesetting abilities are not spared, nor is Oded Goldreich’s spelling.
It would seem hard to defend the idea that one is more likely to make a correct statement if the statement has no proof compared to having a proof, or that one can be secure against a wider class of attacks by relying on intuition rather than defining a class of attacks and establishing the security guarantee. It would be like having a draft-dodger compete in an election against a war hero, and having the latter be on the defensive about his military service, but sometimes strange things do happen.
(For the Italian readers, a better reference would be from Nanni Moretti’s Aprile: “Say something in reply, D’Alema! Don’t let yourself be cornered on justice by Berlusconi of all people! D’Alema, say something left-wing; say even something that is not left-wing, something civilized!”)
Update 9/1/07: you can now read letters to the editor by Oded Goldreich, Boaz Barak, and Jon Katz, and there are more online comments [here] and [here].
Update 9/5/07: Hugo Krawczyk has also written a letter to the editors of the Notices. The interested reader can compare what Koblitz said about Hugo’s work on HMQV to the actual HMQV paper.
I was upset enough at the original set of articles by Koblitz and Menezes, but thought that they at least had some valid points (though the presentation was much more strident than necessary).
This article is unjustifiable. Its appearance in the Notices of the AMS is a disgrace, and a true step backwards if the goal is to involve more mathematicians in “modern” cryptographic research.
I hope many TCS researchers will join me in writing a letter to the editor in complaint.
As an aside, the part of the article on the “two cultures” of math and crypto/CS research reminded me of the following joke due to Noga. Long ago, I asked him if he thought of himself more as a mathematician or a computer scientist; his response was “Since I don’t submit papers at the last minute, I suppose I am not a computer scientist.”
aravind
Nice little cryptographic batrachomyomachia.
The editors could/should have rejected this paper based on the first sentence alone (as it is as far from science and objectivity as can possibly be):
“During the first six thousand years – until the invention of public key in the 1970s …”
Unless, of course, they opted to leave the opening as a big warning sign for non-expert readers.
Are you aware that Koblitz is (or at least was) vocally leftist? For instance, one of his books is dedicated to the Central American freedom fighters (i.e. anti-US), or something very similar. Your swift boat analogy will probably sting. Perhaps that was the point.
The article is a travesty, but the last para does indicate a degree of self-awareness. Even if one were to grant the achievements of the community, one could hope for research in the area to be conducted in a better spirit…
At the time, I also thought that the original article on eprint had some good points, and it made me ponder the nature of science and the goals of modern cryptography. This thing in the Notices is, however, completely ridiculous. I thought the whole point of the article was to critique the meaningfulness of provable security results in the “real world.” Depending on what Koblitz really believes, it might have made sense for him to write an article aimed at practitioners with the title “The uneasy relationship between cryptographic theory and practice” (although ‘cryptography’ is really too broad a term here, and such an article would seriously undermine the goals of provable security, in my opinion). As it stands, I have no idea why this article was published in the Notices and why mathematicians should care about such issues. Also, a lot of the content wasn’t about theory vs. practice at all, and seemed just plain insulting to the cryptographic community.
The article is bogus on many levels. Yes, sometimes people doing “provable security” make mistakes. That is normal in all sciences, and it is quite rare to make such a big deal out of it. Yes, as a young field, there is a lot of competition and people often publish incremental papers, and also publish often. But our best conferences (CRYPTO/Eurocrypt/TCC) still have very good papers, modulo a small percentage of exceptions. And many small ideas deserve to be published. In fact, there are only a handful of really amazing results. So if everybody always waited until they proved Fermat’s Last Theorem, very few people would have papers, period.
The attack on science that intuition is better than proofs – even if proofs are only conditional, and sometimes hold in restricted models – is also totally ridiculous. The complaint about conference vs. journal culture is silly for a young field: there is no time to extensively publish and review papers. I’m sure that in 200 years, cryptography will become closer to topology or other math disciplines. As such, it will become harder, but also more “boring”. But, until this happens, let’s enjoy the moment of rapid progress and continuous excitement!
Next, I agree the term “provable security” is a bit of an overstatement. But this is fair marketing. It is very important for cryptographers to capture the attention not only of the math community, but also of practitioners (i.e., the “security” community). So some marketing is important, or else they ignore us. Yes, we are far from proving unconditional results (although there are nice exceptions, like the work on the bounded storage model, biometrics, etc.). But the fact that we cannot prove $P \neq NP$ does not mean we have to stop doing science. So I see little harm in a term slightly catchier than the reality, as long as we are honest about it.
Finally, in my opinion, the whole reason for the paper and all the personal attacks is the commercial interest of Koblitz in the MQV protocol. I think the whole thing started when Hugo Krawczyk improved MQV (the HMQV paper). True, he had a minor proof mistake and some of his attacks are theoretical in nature. But this does not mean that one should not change the informally secure MQV into the formally secure HMQV, which is BETTER, if only at the theoretical level at this stage. I agree the MQV authors could have been asked to review it, but I’m not sure it was necessary. Neither of them had been doing provable security for a while, so they could say little on a technical level. On the other hand, they both had a commercial interest in being negative and exaggerating (which is what happened!), so maybe it’s good nobody asked for their biased opinion :).
In any case, the silly article has already done some harm. So we should think about what we can do to prevent people like Koblitz from mounting amateurish and silly attacks like this paper!
I disagree with Koblitz on a lot of things, but I agree with him on one thing: provable security research contains too much hype. I do not mind that somebody works in provable security, but if it becomes mainstream, it will harm cryptography. There are cryptographers who work hard without publishing and end up with big results, like the recent breakthrough in hash functions. The sad thing is that those good works are drowned out by papers on provable security at the CRYPTO/Eurocrypt conferences.
This article and the preceding essays are so full of misrepresentations and exaggerations that they should not be ignored, as some people might take them seriously. I sent the following letter to the editor of the Notices (of course space does not permit going over each individual claim and explaining why it is either false or overblown).
To the editor,
In the famous joke, a mathematician would not infer the color of a sheep’s right side from its left side. But Neal Koblitz, in his article “The Uneasy Relationship Between Mathematics and Cryptography,” makes quite a few broad generalizations from a handful of anecdotes.
I found particularly misleading Koblitz’s comments on proving the security of cryptographic protocols, which he calls an “over-hyped” idea that only leads to “false confidence”. Proofs of security of cryptographic protocols are standard mathematical proofs, and in that sense they are no more over-hyped, and give no more false confidence, than proofs in calculus. Of course, as with any mathematical statement, three questions arise: (1) What exactly is the statement being proven? (2) If the statement is an implication, do we have reason to believe that the antecedent is true? and (3) Is the proof correct?
In this note and his previous essays with Menezes, Koblitz finds cases where security proofs are lacking on one or more of these points, and generalizes these into a blanket criticism of security proofs. Even though I agree there might be a few more “crazy” assumptions and incorrect proofs in cryptography than in other areas of mathematics, this is to be expected in such a young and vibrant field. As time progresses we should see more repeated validation of central results, and gain a better understanding of which assumptions are solid and which are not. (Perhaps eventually some of these assumptions will be proven, although at the moment they seem to be as hard as mathematical questions that went unresolved for centuries.)
As in any mathematical discipline that attempts to model reality, point (1), the meaning of the statement being proven, is where cryptography has an inherent difficulty. We all know that the impossibility of angle trisection depends on the precise definition of the allowed operations, but none of us relies on this result to protect our credit card information. Here indeed cryptographers have sometimes misstepped and inadequately modeled the scenarios in which their systems could be attacked, leading to systems that, regardless of whether they have proofs of security, are in fact insecure in practice. But the problem lies not inherently with proofs of security but rather with cryptography itself, a notoriously difficult subject which over its long history has seen many great minds miss subtle points and design systems that were eventually broken.
In fact, the only way to systematically improve the security of systems is to insist on precise modeling and definitions, and then to study these definitions using mathematical proofs, on the way identifying and correcting subtle weaknesses in protocols. Indeed, Koblitz’s anecdote on the MQV and HMQV protocols demonstrates precisely how careful definitions and insistence on proofs can direct an incremental process towards more secure protocols.
Best,
Boaz Barak
Assistant Professor of Computer Science
Princeton University
In his earlier diatribes with Menezes, Koblitz used to have a valid point to make (one that went beyond the supposed value of intuition over theorems and the ad hominem attacks): namely, that despite the precise way crypto theorems have been stated since the mid-1990s, many of these bounds are so weak as to be meaningless once practical parameters are plugged in, so their value has been overestimated.
In trying to make his point that cryptography brings corruption (more precisely, that the money cryptography brings produces the corruption), Koblitz has dropped the valid points, and only the bile remains, in a disjointed collection of anecdotes.
The TCS community is quick to bash Koblitz for exaggerating.

TCS ignores Koblitz’s very valid criticism: TCS churns out a lot of low quality, low content, high hype drivel. How dare he criticize our right to pad our publication count and our ability to pander to get grants!

(There is, of course, plenty of good stuff TCS does, but the signal/noise ratio is pretty embarrassing.)
To anonymous 12, the community is criticizing Koblitz for lying and misrepresenting facts and for his vicious personal attacks against anyone who ever voiced any doubt about his work, not for talking about the quality of submissions to IACR conferences (the majority of which are not in theory!).
His “vision” for the future of the design of cryptographic protocols is ridiculous: do not define anything and just rely on intuition and pray. This has proven to be a very bad strategy in the past, as the (real) MQV story demonstrates.
TCS ignores Koblitz’s very valid criticism: TCS churns out a lot of low quality, low content, high hype drivel. How dare he criticize our right to pad our publication count and our ability to pander to get grants!
and what, exactly, would you consider to be examples? Please give us some explicit citations!
Surely if the noise to signal ratio is so high you can ANONYMOUSLY point out a dozen or so papers from the most recent FOCS/STOC/SODA/CCC/CRYPTO that you can soundly argue are “low quality, low content, high hype drivel”.
— the Driveler
Provable security is a disappointment if you expect to have a single paper that suggests a problem, formally defines it, comes up with a practical protocol that can be implemented from the paper, and has a proof of security that yields strong bounds for practical key sizes.
This is almost never the case. Rather, there will be a sequence of papers refining the definition, tightening the proofs, and dealing with the existence of protocols at varying degrees of abstraction.
This is not because computer scientists want to pad their CVs but rather because zeroing in on the right definition and finding good protocols and tight proofs take time, and often require different people with different skills.
This may be a cumbersome process and we can of course get “stuck” at some point, with a huge gap between the protocols we can analyze and the efficiency we need for practice. Also, as things are implemented, one may notice certain side channels and conditions and need to refine the definition further.

But as far as I know, this is the best process we have, and the alternative of relying on smart people’s intuition was tried for thousands of years, and always led to broken codes. (Even though those codes were much less ambitious than public-key crypto, CCA security, secure election protocols, and the other crypto applications we have today.)
Boaz Barak
“It would be like having a draft-dodger compete in an election against a war hero, and having the latter be on the defensive about his military service, but sometimes strange things do happen.”
Exactly in what sense is Koblitz a draft dodger putting war heroes on the defensive? Are you accusing him of proving theorems while simultaneously advocating the opposite for the TCS crypto community?
The analogy is that Koblitz attacks research in the foundations of cryptography precisely for what are in fact its merits, the striving for precise definitions and proofs.
Attacking an opponent on his strengths may seem an absurd debating strategy, because such an attack cannot rely much on facts.
It is, however, very effective, because it is very hard to reply to it.
One cannot civilly say “you make a number of valid points about my weaknesses, but let’s talk about my strengths…” because it’s now the strengths themselves that are the subject of the debate.
One then has to point out that the original argument is nonsensical, but then it becomes a “controversy,” and in a controversy an observer expects both parties to be biased and the truth to lie in the middle. But the middle between nonsense and reality is a half-truth, and this is where it eventually ends.
I agree with part of the analogy,
but I don’t think Koblitz deserves to be called a draft dodger ;).
As far as quality of research goes, he compares favorably with most war heroes in the TCS crypto community.
No, no, Koblitz is not a draft dodger, he is a swift vet. The draft dodger is the notion that one can develop secure cryptosystems without precise definitions and proofs. And researchers in the crypto theory community are not war heroes, they are Bob Shrum.
(For the Italian readers: “And the Radicals?” “The Radicals are the finocchiona.”)
Koblitz is a draft dodger with respect to providing methods for assessing the security of proposed cryptographic protocols. Whatever you say about the theoretical community in cryptography, they are very precise about the sense of security that their proofs establish (certainly if one reads beyond the title of a paper).
Koblitz and his cohort, on the other hand, refuse to define their notion of security, let alone prove anything meaningful about their protocols. Koblitz, now playing the part of Karl Rove, then attacks the theoreticians who worked hard on the analysis (see e.g. http://eprint.iacr.org/2005/176) for what is a minor and easily correctable mistake. Koblitz does not make such mistakes, since he does not prove anything about his protocols.
I posted here a letter that I submitted to the Editor of the Notices of the AMS in response to the recent article by Neal Koblitz in the Notices. The letter is intended to tell, as concisely as I can, the REAL HMQV story, and through it to tell the story of the amazing success of the Theory of Cryptography (TOC). This success is reflected not only in the rigorous mathematical foundations that TOC has laid for Cryptography at large, but also in its ability to guide us in the design of truly practical solutions to real-world problems. The HMQV design is a very good example of the latter aspect of TOC, one that Koblitz tries to negate via personal attacks and ridiculing of the whole field. There is no need to take my word on this, the HMQV paper is available for anyone to read, verify and judge.
“Surely if the noise to signal ratio is so high you can ANONYMOUSLY point out a dozen or so papers from the most recent FOCS/STOC/SODA/CCC/CRYPTO that you can soundly argue are ‘low quality, low content, high hype drivel’.”
I would say that all the recent provable security papers are “low quality, low content, high hype drivel”. Sorry if it offends you.
“I would say that all the recent provable security papers are ‘low quality, low content, high hype drivel’. Sorry if it offends you.”
You’ve read all the recent provable security papers? I don’t even read all the papers in areas I like.

As an example, why don’t you tell us what’s low quality, low content, or high hype in the HMQV paper?
“This is not because computer scientists want to pad their CVs but rather because zeroing in on the right definition and finding good protocols and tight proofs take time, and often require different people with different skills.”
Recently, I’ve done some work in a field outside crypto in which exactly this sequence needed to take place. We were refining an inadequate definition, which in turn refined a previous one, and so on. The process seems to be converging, in that the field gets progressively stronger, more realistic results. The referees went “Koblitz” on us and rejected the paper for being incremental, only one link in a long chain, with the new model likely not being the ultimate word on the subject. The comments are all true, by the way. What Koblitz and the referees failed to notice is what Boaz points out: certain solutions require the input of many different perspectives before the field as a whole can zero in on the right model and definitions, e.g., on which attacks are reasonable and which are not.
Congratulations! It looks like we have a new little controversy that we should cherish and develop.
1) Mathematics, rigorous and non-rigorous.
Somehow, contrary to Scott’s reaction, and to the title of Michael’s post, it is Oded Goldreich who represents the “pure mathematics” stance. For example, in his manuscript “On post-modern cryptography” Oded criticized Koblitz and Menezes for saying that the “theorem-proof paradigm of theoretical mathematics is often of limited relevance” (in cryptography).
Mathematical rigor is a theme of quite a few other ongoing controversies and discussions (e.g., the little one regarding statistical physics approaches to complexity theory and the larger one regarding string theory and physics).
There are many areas of a mathematical nature (including applied mathematics and certainly physics) where heuristic methods, numerical methods, methods based on simulations, and methods based on various calculation procedures not grounded in rigorous math are important and often cannot be replaced by purely rigorous mathematics. So I do not think the idea that cryptography theory can also be developed outside the theorem-proof paradigm can be dismissed out of hand.
But, of course, the question is whether Koblitz, Menezes, or anybody else provides useful ideas on how to develop academic cryptography outside the theorem-proof paradigm.
2) Theoretical CS
As Luca explained in his post, the rigorous mathematical development of modern cryptography, and the placing of the foundations of cryptography on formal, rigorous grounds, is an amazing intellectual achievement, one that changed the area of cryptography and led to major surprising developments in complexity theory. Moreover, this is precisely the place where academic scholarly research has an advantage over the way cryptography is done elsewhere.
3) The asymptotic paradigm
Another part of the controversy is the asymptotic paradigm of theoretical computer science. Of course, it is better to compute a function (say, the running time of an algorithm in the worst case) explicitly rather than just understand its asymptotic behavior. But the asymptotic point of view is responsible for many of the great achievements in mathematics and in theoretical computer science. One main reason for the success of the asymptotic point of view is that it is feasible in many cases where a complete understanding is infeasible.
The debate about insights gained by asymptotic study is not new. Persi Diaconis has a well-known critique of theoretical-computer-science-style (or Erdős-style) asymptotic bounds, especially those where all constants are neglected.
Understanding the behavior of functions asymptotically and using polynomial reductions sometimes gives bounds which are not very telling for practical values of the parameter. For example, improving the running time of an algorithm for factoring an $n$-digit number from $\exp(n^{1/2})$ to $\exp(n^{1/3})$ was quite sensational, but much less sensational from the point of view of complexity theory than an improvement to $\exp((\log n)^3)$ (not to say $n^{20}$) would be. Yet the latter improvements would be practically irrelevant.
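(A quick back-of-the-envelope computation, with numbers of my own choosing just to illustrate the point, shows how extreme the gap is at realistic sizes: the asymptotically superior bound is astronomically worse for any $n$ we would ever factor.)

```python
import math

# Rough comparison of the three hypothetical running-time bounds for
# factoring an n-digit number, ignoring all constants: the asymptotically
# best bound, exp((log n)^3), is by far the worst at realistic sizes.
for n in (100, 200, 500):
    print(f"n={n:4d}  exp(n^(1/2)) ~ {math.exp(n ** 0.5):.1e}   "
          f"exp(n^(1/3)) ~ {math.exp(n ** (1 / 3)):.1e}   "
          f"exp((log n)^3) ~ {math.exp(math.log(n) ** 3):.1e}")
```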
One interesting teaching (or intuition) of theoretical computer science, for which there is a lot of evidence, is the remarkable effectiveness of asymptotic insights.

We tend to believe (and, if you want, this is an insight of theoretical CS supported by non-rigorous intuition) that asymptotic results tell us more than what we can rigorously deduce from them.
Here is an example: What is the best “explanation” we have for the fact (observed in the early 50s) that the simplex algorithm for linear programming is so efficient?
In my opinion it is the theorem, proved by Khachiyan, that LP is in P. The relevance of LP being in P is not a formal mathematical consequence of Khachiyan’s theorem: the bounds Khachiyan obtained and the ellipsoid method he used are practically terrible, and the ellipsoid method is quite different from the simplex method. Still, LP being in P is the crucial fact. (There are other good explanations, and they too are based on asymptotic insights.)
4) Related debates.
Let me mention two related little debates. The first is on the question of “what is randomness”.
Theoretical CS offers, in my opinion, deep and profound insight into randomness (based on complexity theory, on the asymptotic paradigm, and of course on the theorem-proof methodology). This does not mean that the various approaches to randomness from statistics are obsolete.
The second, already mentioned, is concerning the usefulness of statistical physics (often non-rigorous) methods in complexity theory.
5) Rhetoric and philosophy.
Certainly the rhetoric of Neal and of Oded makes this controversy very promising.
Both Neal and Oded have prior experience in controversies. Neal Koblitz was quite prominent in the interesting controversy regarding Huntington’s usage of mathematics (or at least of mathematical language) in political science (which deviated quite a bit from the theorem-proof paradigm). Oded Goldreich has a strong stance regarding quantum computers.
Koblitz’s rhetoric is strong, and it is a little strange to see his paper published unedited in the Notices. (But there are precedents for similar Notices papers.) Oded’s rhetoric is perhaps even stronger, and the terms he uses do not seem convincing (post-modernism? fetishism??), but his paper is only an Internet publication. The reference to King Hezekiah is quite hilarious.
One nice thing about strong rhetoric is that it leads to nice new context-free universal debating tools. For example, Oded refers to Koblitz and Menezes’s approach as fetishism. This seems to be quite novel; I never saw it used before in debates. (The term itself can be used quite universally: as much as Oded can blame Neal for intuition-fetishism, Neal can blame Oded for theorem-proof fetishism.)
Overall, Oded’s paper is a philosophical piece and it gives another reminder (along with several posts on Scott’s blog and this one) that modern theoretical CS has much to offer to philosophy.
The difference between cryptography and other areas, such as mathematical physics, heuristics, etc., is that no one has a good way to “simulate” a system to find out if it’s secure.
So it seems that proofs are much more needed in cryptography than in almost any other domain.
On the “swift boat”: it is not Koblitz who is the draft dodger, it is the stance he advocates. No one can deny that Koblitz has enriched the field with many substantial original contributions, but when we’re talking about how to gain confidence in our cryptographic protocols, proof versus intuition should be a one-sided battle, and that’s where the analogy applies.
For the “Reduced Neal Koblitz” see David Eppstein’s post http://11011110.livejournal.com/114876.html
Very funny and refreshing.
Regarding the use of the word “fetishism” in Oded’s essay, and on rhetoric: Oded did *not* write that the advocating of intuition is fetishism, nor that the attack on the theorem-proof paradigm is fetishism. Oded wrote that the use of the *Random Oracle Model* is fetishism, and with regard to this issue the reference to King Hezekiah does make sense.
Koblitz missed (or ignored) this point and claimed that Oded blames him for fetishism, but that is not the case.
NSA crypto legal battles continue
http://www.prosefights.org/nmlegal/dcvoid/dcvoid.htm#feehan3
1) More comments on Neal’s Notices piece
The first half is a very nice personal description of the history of elliptic curve cryptography, and indeed this is a very remarkable story. One thing to note is that one of the figures who contributed to this area, Gerhard Frey, is a true mathematical hero of our time: he is the person who got the ball rolling for the proof of Fermat’s Last Theorem (via connections to elliptic curves).
The second half of Neal’s paper is a missed opportunity for him to present his ideas on cryptography. Instead he engages in side issues, unconvincing claims, complaints, and jokes.
2) More comments on Oded’s “post-modern cryptography” piece
It is quite possible that Koblitz and Menezes’s ideas do not merit an answer (mainly because what they say seems rather undeveloped and vague), but an essay which is mainly philosophical probably cannot do the job of debunking such claims. It seems that scientific controversies cannot be decided on the basis of philosophical arguments.
Oded’s philosophical paper is nevertheless quite interesting. Describing in more detail how cryptography became a science (in a process which can be described as putting vague intuitive insights on formal grounds) and the amazing concepts that emerged and are mentioned in the paper (like zero-knowledge proofs), at the expense of some of the polemic against Koblitz and Menezes, could lead to a very nice paper.
3) A few comments on Koblitz and Menezes’s “Another look at ‘provable security'” paper. (Taken from Neal’s homepage; I do not know whether this is identical to the journal version.)
3.1) Abstract
The abstract contains some strong claims which are quite remote from what the rest of the paper is about. Compare, for example, the abstract with the conclusion in Section 7. The four points of the conclusion section are quite mundane and completely different from what the abstract promises. Here is a typical item from the conclusion:
“Currently the optimal RSA type encryption is the Boneh-Rabin ‘simplified OAEP’ “.
Well, this is a legitimate opinion, and the authors give some intuitive arguments to support it and reject some intuitive arguments to the contrary. How is it related to the claim from the abstract against the theorem-proof paradigm, and to papers using this paradigm being “frequently misleading”?
Overall the abstract is rather misleading.
3.2) The paper is not at all appropriate for the large audience of mathematicians that is promised in the abstract.
3.3) How does one distinguish “psychological intuition” from “logical intuition”?
Regarding the Boneh-Rabin suggestion, the paper describes an intuitive argument for its superiority and rejects a (“psychological”) intuitive argument for caution as being illogical. If Koblitz and Menezes can develop a clear methodology to distinguish between psychological intuitions and logical intuitions, this will be a great achievement, far beyond the scope of cryptography.
3.4) As comment #26 pointed out, there are reasons why the theorem-proof paradigm matters even more in cryptography than in other areas. Take the example of algorithms (where the theorem-proof paradigm is also quite dominant in academic study). When designing an algorithm there is no reason to assume that we face malicious adversaries who would like to make our algorithm fail, or to fool us into using a bad algorithm. An assumption of malice of this kind is quite reasonable when it comes to applied cryptography. Suppose that an analog of the “random oracle” were used to test heuristically the performance of an algorithm. There would be no reason to assume that somebody (outside academia) would intentionally develop algorithms for which this method fails. This is not the case when it comes to applied cryptography.
3.5) The claim that proofs are checked more carefully in pure mathematics than in cryptography or theoretical computer science looks baseless and incorrect.
Finally, without being fanatical about the theorem-proof paradigm, it is not clear what Neal and Alfred suggest as an alternative. Take an argument like this:
“When we put the two problems RSA1 and RSA2 side by side and compare them, we see that it defies common sense to suggest that in practice someone would find RSA1 easier to solve than RSA2”.
What is the paradigm here?
Real world practical cryptography means an implementation that works.

NSA liked hardware, not software, implementations … for good reason.

Too easy to spike software implementations.
http://www.jya.com/nsasuit.txt
http://www.aci.net/kalliste/speccoll.htm
NSA algorithms I viewed were simple to implement in hardware, and worked.
My opinion is that the crypto problem was solved in 1918.
http://www.aci.net/kalliste/bw1.htm
regards
http://www.prosefights.org/muxnumber/numberconversion/numberconversions.htm
I must say I agree with Koblitz and Menezes in part. And I find they provided constructive criticism on the future of cryptography and theoretical computer science. The thing that motivated all this was the petulance of Hugo Krawczyk, who proposed the HMQV protocol and claimed that Menezes’s results were not good based only on his provable-security assumptions. And MQV, proposed by Menezes et al., is actually being used by the NSA in practical implementations. Krawczyk and other computer scientists should be humbler. Result: Menezes, by means of a technical article (2007), proposed an improved protocol and showed the many mistakes Krawczyk’s work had.
Koblitz and Menezes are very brave. Good luck to them.
PS: I am a graduating EECS student and my research is focused on Multi-Party Computation, and even so I think Koblitz’s criticisms are relevant to building the future of Theoretical Crypto.
I study Provable Security and Koblitz’s papers are much more fun to read than Goldreich’s work.
In his philosophical paper he mentions the Old Testament, and this is a scientific discussion, not a religious one.
Goldreich’s papers are really difficult to read. I wish more cryptographers working in the foundations of cryptography were able to write as well as Koblitz does.
I tend to agree with comment #35.
I find Goldreich’s work extremely difficult and boring to read.
1) I spend the majority of my time flipping back and forth following his cross-references instead of actually understanding something. I guess his proofs are perfect candidates for automated verification.
2) Goldreich always tends to over-hype his work using flashy titles. (But then, who doesn’t these days?)