There’s been a lot of media attention in the last few days to a wonderful research paper on the weakness of 1024-bit Diffie-Hellman and on how the NSA can (and possibly does) exploit this. People seem shocked about the problem and appalled that the NSA would actually exploit it. Neither reaction is right.
In the first place, the limitations of 1024-bit Diffie-Hellman have been known for a long time. RFC 3766, published in 2004, noted that even a 1228-bit modulus provides only about 80 bits of strength; 1024 bits gives you less than that. That’s clearly too little. Deep Crack cost $250,000 in 1997 and brute-forced DES’s 56-bit key. Straight Moore’s Law calculations take us to 68 bits; we can get to 78 bits for $250 million—and that’s without economies of scale, better hardware, better math, etc. Frankly, the only real debate in the cryptographic community—and I mean the open community, not NIST or the NSA—is whether 2048 bits is enough, or if people should go to 3072 or even 4096 bits. This is simply not a surprise.
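For the curious, here is a rough sketch of that back-of-the-envelope arithmetic in Python. The 18-month doubling period and the 2015 target year are assumptions on my part; the costs and key sizes are the ones above, and the variable names are purely illustrative:

    import math

    # Back-of-the-envelope: Deep Crack did a 56-bit key for about $250,000.
    # Assume Moore's Law doubles price/performance every 18 months.
    base_year, base_bits, base_cost = 1997, 56, 250_000
    year, budget = 2015, 250_000_000

    doublings = (year - base_year) / 1.5                # about 12 doublings
    bits_same_cost = base_bits + doublings              # ~68 bits for $250,000 today
    bits_big_budget = bits_same_cost + math.log2(budget / base_cost)  # ~78 bits

    print(round(bits_same_cost), round(bits_big_budget))   # 68 78

A thousandfold bigger budget buys roughly ten more bits (log2 of 1000 is about 10), which is where the 78-bit figure comes from.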
That the NSA would exploit something like this (assuming that they can) is even less surprising. They’re a SIGINT and cryptanalysis agency; that’s their job. You can argue that SIGINT itself is unethical (shades of Stimson’s “gentlemen do not read each other’s mail”), but that the NSA would cryptanalyze traffic of interest is even less of a surprise than that 1024-bit Diffie-Hellman is crackable.
There’s also been unhappiness that IPsec uses a small set of Diffie-Hellman moduli. Back when the IETF standardized those groups, we understood that this was a risk. It’s long been known that the discrete log problem is “brittle”: you put in a lot of work up front, and then you can solve each individual instance relatively cheaply. The alternative seemed dangerous. The way Diffie-Hellman key exchange works, both parties need to use the same modulus and generator. The modulus has to be prime, and should be of the form 2q+1, where q is also prime. Where does the modulus come from? Presumably, one party has to pick it. The other party then has to verify its properties, and the protocol has to guard against downgrades or other mischief just in agreeing on the modulus. Yes, it probably could have been done; our judgment was that the risks weren’t worth it. The real problem is that neither vendors nor the IETF abandoned the 1024-bit group. RFC 4307, issued ten years ago, warned that the 1024-bit group was likely to be deprecated and that the 2048-bit group was likely to be required in some future document.
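To make that verification burden concrete, here is a toy sketch of what a peer would have to check before accepting a proposed modulus, and of why both sides must share the same p and g. The numbers are tiny textbook values and the trial-division primality test is naive—real groups use 2048-bit or larger moduli and probabilistic primality tests—and the helper names and secret exponents here are made up for illustration:

    # Toy illustration only: real Diffie-Hellman moduli are 2048+ bits and
    # primality is checked with probabilistic tests, not trial division.

    def is_prime(n: int) -> bool:
        return n > 1 and all(n % d for d in range(2, int(n ** 0.5) + 1))

    def is_safe_prime(p: int) -> bool:
        # p must be prime and of the form 2q + 1 with q also prime
        return is_prime(p) and is_prime((p - 1) // 2)

    p, g = 23, 5                  # tiny textbook values, not real parameters
    assert is_safe_prime(p)

    # Both parties must agree on (p, g); each picks its own secret exponent.
    a, b = 6, 15                  # secret exponents (made-up values)
    A = pow(g, a, p)              # Alice sends A to Bob
    B = pow(g, b, p)              # Bob sends B to Alice
    assert pow(B, a, p) == pow(A, b, p)   # both derive the same shared secret

The appeal of the fixed, standardized groups was precisely that implementations didn’t have to do this kind of verification, and guard the negotiation around it, on the fly.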
By the way: that these two results are unsurprising doesn’t mean that the Weak DH paper is trivial. That’s not the case at all. The authors did the hard work of showing just how feasible the attack actually is; being unsurprising is not the same as being easy to demonstrate. The paper fully deserved its “Best Paper” award.