Technologists and law enforcement have been arguing about cryptography policy for about 30 years now. People talk past each other, with each side concluding that the other side are unreasonable jerks, because the two sides start from fundamentally incompatible assumptions: two conceptual worlds in collision.
In the physical world, bank branches have marble columns and granite counters and mahogany woodwork to show the world that they are rich and stable. This works because that kind of building is slow and expensive to construct. Even if something with marble and mahogany is not exactly a bank, it is still likely to be rich, stable, and bank-like. But on the Internet, whatever a bank can do on its website, bored teenagers in Moldova can copy, which is one of the reasons we have so much phishing. People’s assumptions about what banks look like don’t just fail a little bit; they fail catastrophically.
In the physical world, when things fail, they tend to fail gradually. It is not surprising when a building has leaks and cracks, but very surprising when it collapses. Pre-computer security models generally failed a little bit, too. If the law says you need a court order for a wiretap, and someone lies to a judge or sends the phone company a forged order, that lets them tap one line, not the entire phone system.
But in software, catastrophic failure is normal. Software security breaches don’t just disclose one or two account credentials; they leak every user’s credentials. They don’t give the attackers access to one customer’s network; they get into every customer’s network.
Cryptographic software has the same problems as any other software. Decades of experience have taught us that cryptographic software can fail, and when it fails, it is likely to fail catastrophically.
This is where the talking past each other happens. Law enforcement people who want back doors, or lawful access, or whatever it’s called these days, have a wiretap mental model: there are rules to control who gets to use the back door, the rules will mostly work, and the costs when they don’t are contained. So it’s a small decrease in security, a reasonable tradeoff to fight all that crime.
We software people have the catastrophe model. If you build a back door into your device, the system will always be one disaffected clerk or one misconfigured server away from hostile private and state actors being able to open that back door anywhere, at any time: a catastrophic failure. Personally, I think that’s a much more likely scenario.
It’s not like we haven’t tried to explain this, but the people who believe in the wiretap model believe in it very strongly, leading them to tell us to nerd harder until we make it work their way, which of course we cannot.
I don’t see any way out of this impasse, which does not mean I am ignoring or minimizing the issues that law enforcement is trying to deal with. But compromise with catastrophe just doesn’t exist.