In the debate over government “exceptional access” to encrypted communications, opponents with a technical bent (and that includes me) have said that it won’t work: that such a scheme would inevitably lead to security problems. The response, from the policy side rather than from technical folk, has been to assert that perhaps more effort would suffice. FBI Director James Comey has said, “But my reaction to that is: I’m not sure they’ve really tried.” Hillary Clinton wants a “Manhattan-like project, something that would bring the government and the tech communities together”. More effort won’t solve the problem, but that misunderstanding lies at the heart of why exceptional access is so hard.
The Manhattan Project had to solve one problem. It was a very hard problem, one they didn’t even know could be solved, but it was just one problem. Exceptional access is a separate problem for every app, every service, every device. Possibly, a few will get it right. More likely, they’ll fail even more abysmally than they’ve already failed at simple encryption. Study (“Developers have botched encryption in seven out of eight Android apps and 80 percent of iOS apps”) after study (“10,327 out of 11,748 applications that use cryptographic APIs—88% overall—make at least one mistake”) after study (“root causes are not simply careless developers, but also limitations and issues of the current SSL development paradigm”) after study (“We demonstrate that SSL certificate validation is completely broken in many security-critical applications and libraries.”) has shown the same thing: programmers don’t get the crypto right, and these are primarily studies of apps that use standardized and well-understood protocols and APIs.
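To make that failure mode concrete, here is a minimal sketch of the mistake the last two studies describe, using Python’s standard ssl module (example.org is just a placeholder host, not anything taken from the studies): rather than fixing a certificate problem, developers switch validation off, and the code still appears to work.

```python
import socket
import ssl

# The classic mistake: "fixing" certificate errors by turning validation off,
# which silently removes any authentication of the server.
broken = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
broken.check_hostname = False        # hostname is no longer checked
broken.verify_mode = ssl.CERT_NONE   # any certificate, from anyone, is accepted

# The correct pattern: let the library do full chain and hostname validation.
ctx = ssl.create_default_context()
with socket.create_connection(("example.org", 443)) as sock:
    with ctx.wrap_socket(sock, server_hostname="example.org") as tls:
        print(tls.version(), tls.getpeercert()["subject"])
```

The broken context would still complete a TLS handshake and carry traffic; nothing fails visibly, which is why the insecurity is silent.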
Oppenheimer and company had the advantage of effectively unlimited resources. When confronted with two design choices, they could try both. This let them avoid dead ends, such as trying to build a gun-type bomb with plutonium. They could try gaseous diffusion, thermal diffusion, centrifuges, and electromagnetic separation for uranium enrichment. (The latter required more than $1 billion worth of silver wire, and they got it.) App developers don’t have that luxury. Even if one or two do, they don’t share their source code with each other. Besides, most developers don’t know whether they’ve gotten it right or wrong; the failure mode here is silent insecurity. Most of the time, holes like these are found by people who do a serious penetration study, and those people are generally the attackers.
One size doesn’t fit all when it comes to cryptography. That’s why cryptographic APIs are so complex. We can’t solve the exceptional access problem once and for all, and individual efforts won’t suffice for particular cases.
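As a rough illustration of how much even a deliberately simplified interface still leaves to the developer, here is a sketch using the AESGCM recipe from the third-party cryptography package (the choice of library and of AES-GCM is mine, for illustration, not something from the article): key length, nonce uniqueness, and what to authenticate all remain the caller’s problem.

```python
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

key = AESGCM.generate_key(bit_length=256)   # the caller picks the key size
aead = AESGCM(key)

nonce = os.urandom(12)                      # must never repeat under the same key
ciphertext = aead.encrypt(nonce, b"secret message", b"associated data")

# The nonce has to be stored or sent alongside the ciphertext, and reusing one
# under the same key destroys the security guarantees without raising any error.
plaintext = aead.decrypt(nonce, ciphertext, b"associated data")
assert plaintext == b"secret message"
```

And this is the easy, one-size-fits-most case; protocols, key management, and any exceptional-access mechanism would each add their own decisions on top.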
I think a much better answer to the ‘try harder’ demand is to point out that the government cannot agree on what the access criteria would be. Machines can only follow instructions. If the requirements can’t be stated, there isn’t a solution.
Does the FBI want law enforcement in Europe to have backdoor access or not? If Europe is in, then what about Turkey, Israel and Saudi Arabia? How about Russia and China?
What none of the advocates of backdoors are willing to face is that this is not a problem of providing a particular technology to the US government. The actual problem they need to solve is how to deny access to a technology that has been freely available for over forty years.
That said, even if it were possible, I would still oppose backdoors: the continued presence of a gulag in Guantanamo, the failure to prosecute any of those responsible for the systematic use of torture at Abu Ghraib, and the fact that rather too many members of the establishment excuse the toppling of democracy in Iran, Chile, and elsewhere are the reasons why I oppose them, and why I am working to destroy as much of the apparatus of pervasive surveillance as possible.
A country that looks set to nominate Trump for President should not be trusted with an army, let alone a pervasive surveillance system covering the planet.
I think there’s another lesson to draw from the Manhattan Project.
Once most of the scientists realised what they were building, they didn’t want to do it, and the more time that elapsed, the more of them regretted the outcome.
Eventually the political class caught up, and we are now trying to undo history: today most sane politicians would prefer that nuclear weapons simply didn’t exist. It was Reagan, after all, who proposed START I.