The argument over end-to-end encryption is heating up as work on TLS 1.3 moves forward in the IETF. The naysayers, however, are also out in force, arguing that end-to-end encryption is a net negative. What is their line of argument? According to a recent article in CircleID, it seems to be something like this:
The idea of end-to-end encryption is recast as a form of extremism, a radical idea that should not be supported by the network engineering community. Is end-to-end encryption really extremist? Is it really a threat to the social order?
Let me begin here: this is not just a technical issue. There are two opposing worldviews in play. Engineers don’t often study worldviews or philosophy, so these questions tend to get buried in a lot of heated rhetoric.
In the first, people are infinitely malleable, and will be, or should be, shaped by someone (with the government being the most logical choice) into a particular moral mold. In this view, the government must always hold an asymmetry of power; if any individual citizen, or any group of citizens, can stand against the government, then the government is under direct existential threat. By implication, if government is the founding order of a society, then society itself is at risk.
In the second, the government arises out of the moral order of the people themselves. In this view, the people have the right to subvert the government; this subversion is only a problem if the people are ethically or morally incompetent in a way that causes such undermining to destroy the society. However, the catch in this view is this: as the government grows out of the people, the undermining of the government in this situation is the least of your worries. For if the society is immoral, the government—being made up of people drawn from the society—will be immoral as a matter of course. To believe a moral government can be drawn from an immoral population is, in this view, the height of folly.
What we are doing in our modern culture is trying to have it both ways. We want the government to provide the primary ordering of our society, but we want the people to be sovereign in their rights as well. Leaving aside the question of who is right, this is a worldview issue that cannot be solved on technical grounds. How do we want our society ordered? Do we want it grounded in individuals who have self-discipline and constraint, or in government power to control and care for individuals who do not have self-discipline and constraint? The question is truly just that stark.
Now, to the second point: what of the legal basis laid out in the CircleID article? The author points to a settlement around the 3G standard in which one participant claimed its business was harmed because its location-tracking software was not considered for the standard, primarily because the members of the standards body did not want to enable user tracking in this way. The company argued that the members of the standards body had conspired against it, and hence that the actions of the standards body fell under anti-trust law.
Since the case settled, there was no actual ruling, and I’m not a lawyer, but the issues around encryption technology seem different from those considered in the case cited above (TruePosition, Inc. v. LM Ericsson Telephone Co., No. 11-4574 (E.D. Pa. Oct. 4, 2012)). For encryption technology, it seems to me, the argument would need to be somewhat different. Assume someone uses a piece of software that implements an encryption standard in the commission of a crime. Turn the software into a car, and the argument would need to look something like this:
Since the car used for the crime depended on tires that were made by a particular company for general commercial use, which depended on the specifications set out by a standards body made up of a number of tire manufacturers in order to allow for interoperability between the various manufacturers in the market, the standards body is responsible for the crime.
I’m just not certain this would be a very compelling argument; you must move the responsibility from the criminal to the manufacturer, and then from the manufacturer to the standards body. So you would need to prove that the manufacturer created the product primarily for use in a criminal enterprise, and then that the standards body created the standard primarily to enable the successful manufacture of interoperable software designed for criminal use. This is a far different line of reasoning than the one used in the case above.
For the argument against end-to-end encryption to stand, two things must happen. First, we must decide that we want the kind of society in which we are essentially wards of an all-knowing state. Second, we must build some sort of legal theory that transfers criminal liability from the criminal to the manufacturer, and from the manufacturer to the standards bodies the manufacturer participates in. I am not certain how such a legal theory might work, but I am quite certain the unintended consequences of such a theory would be disastrous in many ways we cannot now imagine.
Part of the problem is something seen in your article but not commented upon: those who want limitations on encryption always start by discussing criminal actions of one sort or another, then use that as a jumping-off point to discuss limiting encryption across the board. Left unstated is the idea that encryption is only used to hide criminal activity. It’s left unstated because it’s patently false. Instead, it’s usually expressed as “If you don’t have anything to hide, you don’t have anything to fear.”
Nothing to hide? Well, what about your wife’s Christmas present, that expensive necklace she’s been wanting all year? There’s nothing illegal about buying it for her for Christmas, yet you still probably don’t want her to find out what you bought her before she opens her present Christmas morning. And you probably don’t want anybody else knowing either, because a) you don’t want them accidentally tipping her off and b) it’s none of their business what you got your wife for Christmas. The same thing applies to a lot of things: your credit-card charges, your checking-account register, email chains discussing software integrations that’re covered under non-disclosure agreements, conversations with your attorney about legal documents you’re having prepared, conversations with your doctor about medical conditions in your family; the list goes on forever. None of that’s anything you’d want to hide in the sense of hiding illegal activity, but all of it’s things you’d want to hide in the sense of not having it made public. And none of it’s stuff the people wanting restrictions on encryption want to discuss, because the moment the idea comes up that people can want to hide something without it being illegal or wrong in any way, their entire argument collapses.

We haven’t even gotten to the other half of the equation, either: signatures. Digital signatures are based on encryption, and it’s perfectly reasonable to want them to be unforgeable for the same reasons we want the signature on any important document to be unforgeable.
And the encryption has to be strong because, as Mr. Rutkowski doesn’t appear willing to acknowledge (although he certainly seems to understand), encryption mathematically doesn’t have any middle ground. It’s either secure from any third party or it’s vulnerable to any third party; it can’t be made vulnerable to some specific third parties while still being secure against the rest. So, do the parties who want restrictions on encryption really want to claim that everything discussed above doesn’t matter, and that everyone should be willing to give all that up, including allowing their signatures to be forged on legally binding documents? This is something that needs to be brought up more often in the discussion of encryption, IMHO.
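The commenter’s point about signatures and the missing middle ground can be illustrated with a small sketch. This uses Python’s standard-library HMAC as a stand-in for a real digital signature scheme (the names and the "escrow" scenario are illustrative assumptions, not anything from the article): whoever holds the key can detect any tampering, but a mandated "lawful access" copy of that same key is, mathematically, a universal forgery key.

```python
import hmac
import hashlib
import secrets

def sign(key: bytes, message: bytes) -> bytes:
    # HMAC-SHA256 stands in here for a real signature scheme
    return hmac.new(key, message, hashlib.sha256).digest()

def verify(key: bytes, message: bytes, tag: bytes) -> bool:
    # Constant-time comparison of the recomputed tag against the claimed one
    return hmac.compare_digest(sign(key, message), tag)

key = secrets.token_bytes(32)          # known only to the legitimate signer
contract = b"Pay Alice $100"
tag = sign(key, contract)

assert verify(key, contract, tag)                  # genuine document verifies
assert not verify(key, b"Pay Mallory $100", tag)   # any tampering is detected

# There is no partial secrecy: anyone holding the key -- an attacker,
# or a mandated escrow agent -- can forge a signature on ANY document.
escrowed_key = key                      # the hypothetical "lawful access" copy
forged_doc = b"Pay Mallory $1,000,000"
forged_tag = sign(escrowed_key, forged_doc)
assert verify(key, forged_doc, forged_tag)  # the forgery verifies as genuine
```

The sketch is deliberately symmetric-key, but the same logic applies to asymmetric signatures: a third party who can decrypt or sign "when authorized" is indistinguishable, cryptographically, from one who can do so whenever it likes.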
Our legal systems are what they are. I provided an outline of the existing ecosystem of legal controls in my article. If you believe they should change, have at it. In the meantime, it is those systems that provide the means of making decisions on what is lawful and who bears the risks.
In addition, feel free also to reference some of the cogent operational concerns and liabilities outlined in TLS 1.3 Impact on Network-Based Security.
> Our legal systems are what they are.

http://www.dumblaws.com

No snowball fights around buildings in Aspen, Colorado: http://www.dumblaws.com/law/1323
Arcades may have no more than four devices in Rocky Hill, Connecticut: http://www.dumblaws.com/law/1323
No concealed-carry slingshots in Haines, Alaska, without a license: http://www.dumblaws.com/law/21
Alaska generally does NOT require a permit or license to conceal carry a handgun: http://dps.alaska.gov/Statewide/R-I/PermitsLicensing/ConcealedHandguns

Now back to the subject at hand: is breakable encryption dumb? Or is it just 1984 doublespeak, “freedom is slavery,” comrade…

> If you believe they should change, have at it.

Why should I expend such effort over stupidity? Why not let them enforce it and see how that works out for them… Intentionally broken “encryption”? Sure, I’ll use that…

“Stupid is as stupid does” - Forrest Gump

Just walk away, folks: https://i.pinimg.com/736x/0c/e5/4c/0ce54cefaf5dcc0c684ce3538df57b41.jpg

The more stupid government “authority” gets, the more it discredits itself, and the easier it becomes for people to walk away. Laws only work when you only have to put a tiny minority in jail; there is never enough jail space to incarcerate most of the population.
The legal system has little to do with this. You still don’t address the technical part and its consequences when doing the legal analysis. If someone proposed a law that required the acceleration due to gravity at Earth’s surface to be 16 ft/sec², and refused to address the fact that that simply isn’t reality, would you take their arguments in favor of the law seriously? You also avoid addressing any of the issues I mentioned. Are you proposing that it’s so critical to allow third-party decryption of traffic that it justifies making digital signatures on documents easily forgeable, with all that entails as we move toward digital documents? Or that people can’t write letters using terms only they and the intended recipient understand, if that means law enforcement can’t figure out what they’re saying? Because that’s precisely what a cipher is.
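The commenter’s closing point, that a private vocabulary shared by two people is already a cipher, is easy to make concrete. The textbook case is the one-time pad: XOR the message with a random key known only to the two parties, and without that key the ciphertext is mathematically unrecoverable. A minimal sketch (the message and names are illustrative, not from the comment):

```python
import secrets

def xor_cipher(key: bytes, text: bytes) -> bytes:
    # XOR each byte of the text with the key; applying the same
    # operation twice with the same key restores the original.
    return bytes(k ^ t for k, t in zip(key, text))

plaintext = b"the eagle lands at midnight"
key = secrets.token_bytes(len(plaintext))   # shared only by sender and recipient

ciphertext = xor_cipher(key, plaintext)
assert xor_cipher(key, ciphertext) == plaintext  # recipient recovers the message
```

No law against "strong encryption" changes the arithmetic here: two people who agree on a key, or on a private meaning for ordinary words, have a channel no third party can read.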
We still live in a world where legal systems decide the difficult issues concerning what is allowed and who bears the risks. Extensive legal controls apply and are available. I’ll provide a link to my treatise on Legal Controls on End-to-End Encryption Warfare when it is available. It is intended as a handbook for policy makers and the legal community. A course may also be available.
We live in a world where hundreds of countries each have different views and laws. Some countries and entities are trying to force their laws and beliefs onto the other countries, against their will. https://www.youtube.com/watch?v=e3I9HkS0mvM

Time index 8:15: “What we have achieved in this agreement is something unprecedented. We defined a problem. [...] And we defined the solution [...] And we created the instrument [...] which forces all countries to have laws of monopoly. And we created the institution that would ensure these are implemented.”

Time index 9:06: “In shaping this treaty we were the patient, the diagnostician, and the physician, all in one.”

“Nothing new under the sun.” - Solomon