To those of us who have worked on crypto policy, the 1990s have become known as the Crypto Wars. The US government tried hard to control civilian use of cryptography: it discouraged academic research, restricted exports of cryptographic software, and—most memorably—pushed something called “escrowed encryption”, a scheme in which the government would have access to the short-term keys used to encrypt communications or stored files.
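To make the idea concrete, here is a deliberately simplified sketch of what key escrow means in practice. It is not the actual Clipper/LEAF design or any proposal from the reports discussed below; the function names, message fields, and the use of AES-GCM (via Python's third-party cryptography package) are illustrative assumptions only.

```python
# Toy illustration of "escrowed encryption": the sender encrypts data with a
# short-term session key, then also wraps that session key under an escrow
# agent's key and attaches the result, so the agent can later recover the
# plaintext without the sender's cooperation. Hypothetical names throughout.
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

def escrow_encrypt(plaintext: bytes, escrow_key: bytes) -> dict:
    session_key = AESGCM.generate_key(bit_length=128)   # short-term key
    msg_nonce = os.urandom(12)
    ciphertext = AESGCM(session_key).encrypt(msg_nonce, plaintext, None)

    # The extra "access field": the session key wrapped under the escrow
    # agent's key and carried alongside the message.
    leaf_nonce = os.urandom(12)
    wrapped_key = AESGCM(escrow_key).encrypt(leaf_nonce, session_key, None)
    return {"nonce": msg_nonce, "ct": ciphertext,
            "leaf_nonce": leaf_nonce, "wrapped_key": wrapped_key}

def escrow_recover(msg: dict, escrow_key: bytes) -> bytes:
    # Anyone holding the escrow key can recover the session key,
    # and therefore the plaintext.
    session_key = AESGCM(escrow_key).decrypt(msg["leaf_nonce"],
                                             msg["wrapped_key"], None)
    return AESGCM(session_key).decrypt(msg["nonce"], msg["ct"], None)

escrow_key = AESGCM.generate_key(bit_length=128)        # held by the agent
msg = escrow_encrypt(b"attack at dawn", escrow_key)
assert escrow_recover(msg, escrow_key) == b"attack at dawn"
```

Even in this toy form, the wrapped key is an extra moving part that must be generated, transmitted, stored, and protected correctly everywhere, which is the kind of added complexity and risk the arguments below are about.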
The technical community pushed back against all of these initiatives. (One side effect was that it got a number of computer scientists, including me, professionally involved in policy issues.) Quite apart from privacy and civil liberties concerns, there were technical objections: we needed strong cryptography to protect the Internet, interoperability meant that it had to be available world-wide, and simplicity was critical. Why? Most security problems are due to buggy code; increasing the complexity of a system always increases the bug rate.
Eventually, the government gave up. The need for strong crypto had become increasingly obvious, non-US companies were buying non-US products—and no one wanted escrowed encryption. Apart from the fact that it didn’t do the job, it did increase complexity, as witnessed by the failure of one high-profile system. There were many papers and reports on the subject; I joined a group of very prominent security and cryptography experts (besides me, Hal Abelson, Ross Anderson, Josh Benaloh, Matt Blaze, Whitfield Diffie, John Gilmore, Peter G. Neumann, Ronald L. Rivest, Jeffrey I. Schiller, and Bruce Schneier) that wrote one in 1997.
The question of strong cryptography appeared to be settled 15 years ago—but it wasn’t. Of late, FBI director James Comey has issued new calls for some sort of mandatory government access to plaintext; so has UK Prime Minister David Cameron. In fact, the push is stronger this time around; in the 1990s, the government denied any intention of barring unescrowed encryption. Now, they’re insisting that their way is the only way. (President Obama hasn’t committed to either side of the debate.)
It’s still a bad idea. The underlying problem of complexity hasn’t gone away; in fact, it’s worse today. We’re doing a lot more with cryptography, so the bypasses have to be more complex and hence riskier. There are also more serious problems of jurisdiction; technology and hence crypto are used in far more countries today than 20 years ago. Accordingly, the same group plus a few more (Matthew Green, Susan Landau, Michael Specter, and Daniel J. Weitzner) have written a new report. Our overall message is the same: deliberately weakening security systems is still a bad idea.
Section 4 is especially important. It lists questions that proponents of these schemes need to answer before opponents can offer specific criticisms. In other words, “ignore this report; that isn’t what we’re suggesting” can’t be used as a counterargument until the public is given precise details.