What appears to be a leaked copy of the Burr-Feinstein bill on encryption back doors is circulating. Crypto issues aside—my co-authors and I have written on those before—this bill has many other disturbing features. (Note: I’ve heard a rumor that this is an old version. If so, I’ll update this post as necessary when something is actually introduced.)
One of the more amazing oddities is that the bill’s definition of “communications” (page 6, line 10) includes “oral communication”, as defined in 18 USC 2510. Now, that section says that “oral communication”
means any oral communication uttered by a person exhibiting an expectation that such communication is not subject to interception under circumstances justifying such expectation, but such term does not include any electronic communication;
Leaving aside the recursion in that definition, it states that oral communications are just that, oral—how could they be encrypted?
“Covered entities”—more below on what that means—have to provide an “intelligible” version (8:5) of information or data that has been “encrypted, enciphered, encoded, modulated, or obfuscated”. We’ll ignore the nit that “enciphered” is a subset of “encrypted”. “Encoded” is more problematic; it might be a form of encryption, but it could also refer to a standard for representing information, e.g., ASCII, the American Standard Code for Information Interchange. It’s tempting to think that the encryption meaning is the intended one, but look at the next word: “modulated”. Modulation has nothing to do with secrecy. If law enforcement can’t cope with a modulation technique, that points to a lack of technical capability, not an attempt by a vendor to conceal information. It also leaves me wondering what “encoded” is supposed to mean.
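To make the distinction concrete, here is a minimal sketch (my own illustration, using Base64 as the example encoding; the bill names no particular scheme). Encoding merely changes the representation of data, and anyone can reverse it without a key, which is exactly what separates it from encryption.

```python
import base64

# Encoding is a reversible change of representation; there is no key and no
# secrecy involved, which is what separates it from encryption.
message = "meet at noon"
encoded = base64.b64encode(message.encode("ascii"))

# Anyone who knows the encoding can undo it; there is no "unintelligible"
# form that a vendor would need to unlock.
decoded = base64.b64decode(encoded).decode("ascii")

print(encoded)             # b'bWVldCBhdCBub29u'
print(decoded == message)  # True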
An “intelligible version” is supposed to be the “original form” (8:14) of the information or data. What does that mean, especially if we’re talking about an encoding format that law enforcement doesn’t understand? Let’s consider, say, the Lytro camera. Lytro cameras use cool technology that lets you do things like change the focus after you’ve taken the picture. You can certainly get a JPG out of a Lytro image—but which one? Focus and depth of field matter. And there is no “original form” save for the actual three-dimensional objects in the field of view of the camera. It’s also worth noting that the JPG format—a way to encode an image—uses a lossy compression algorithm. That is, there is by design no way to go back to the “original”. Should JPG be outlawed?
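The lossiness is easy to demonstrate. Here is a short sketch (my own, using the Pillow imaging library; the library choice and the synthetic image are my assumptions, not anything in the bill): compress a small image as JPEG, decode it, and count how many pixels no longer match what was there before.

```python
import io
from PIL import Image

# Build a small synthetic gradient image so the example is self-contained.
original = Image.new("RGB", (64, 64))
original.putdata([(x * 4, y * 4, (x + y) * 2) for y in range(64) for x in range(64)])

# Encode it as JPEG (the lossy step), then decode it again.
buffer = io.BytesIO()
original.save(buffer, format="JPEG", quality=75)
buffer.seek(0)
decoded = Image.open(buffer).convert("RGB")

# Count pixels that changed in the encode/decode round trip; for a lossy
# format this is normally far from zero, so the "original form" is gone.
changed = sum(a != b for a, b in zip(original.getdata(), decoded.getdata()))
print(f"{changed} of {64 * 64} pixels differ after the JPEG round trip")
```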
There’s also language to vastly expand the amount of metadata that has to be available to law enforcement. The bill speaks of “communication identifying information” (4:19) as something that has to be made available. It sounds like classic metadata, but the definition is now expanded. It generally speaks of “dialing, routing, addressing, signaling, switching, processing, transmitting ... information” (4:21), while the older definition speaks just of “dialing, routing, addressing, or signaling”. The new definition includes “public and local source and destination addressing” (5:8), “port numbers” (5:15), and, I believe, MAC addresses (5:17): “addresses or other information that uniquely identifies the equipment used…”. The older terms were not defined; the new ones are worse. “Processing”? What does that mean? This section opens the door to unconstitutional overcollection. The Wi-Fi router in your home is almost certainly covered by this bill (it’s a “device” used to “facilitate a communication ... of data”), but some of the information that reaches your router never involves a third party at all. That makes it legally unavailable to law enforcement unless they have a wiretap warrant—but this bill requires that the information be available under “any order or warrant issued by a court of competent jurisdiction” (emphasis added). A court order sufficient to obtain metadata is not a warrant.
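To make the scope concrete, here is a hypothetical snapshot (my own illustration; the addresses are documentation examples and none of this text comes from the bill) of what the expanded definition appears to sweep in for a single web request passing through a home NAT router.

```python
# Hypothetical metadata for one outbound HTTPS connection through a home
# router; each field below seems to fall under the bill's expanded
# "communication identifying information".
connection_metadata = {
    "local_source_ip": "192.168.1.23",   # "local ... source ... addressing" (5:8)
    "public_source_ip": "203.0.113.7",   # the router's public, NAT'd address
    "destination_ip": "198.51.100.10",   # "destination addressing"
    "source_port": 54321,                # "port numbers" (5:15)
    "destination_port": 443,
    "source_mac": "aa:bb:cc:dd:ee:01",   # "uniquely identifies the equipment" (5:17)
    "protocol": "TCP",
}

for field, value in connection_metadata.items():
    print(f"{field}: {value}")
```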
A “covered entity” (6:18) is more or less anyone: a software vendor, a hardware vendor, a provider of “remote computing” or “electronic communication” services, and more. At least as important, a “license distributor” (4:10)—the language isn’t completely clear, but it seems to refer to app store operators—must ensure that anything distributed via their store (and that includes not just apps but also “services”) conforms to these requirements. That’s right: even if an app store does not already do vetting, it would be obligated to at least mandate the crypto back door. One wonders what that means for, say, GitHub—even open source software is generally distributed pursuant to a license that is included with the software.
There’s more; I could go on. For example, Orin Kerr noted that the bill “doesn’t require only reasonable assistance: It’s ‘assistance as is necessary’ to decrypt”. The application to home routers is incredibly invasive, since it requires features that such boxes have never had and either remote access by the manufacturer or a serious leak of private information. And all that is on top of the generally bad concept of crypto back doors.
This is a really bad bill.
Steve, while it’s possible that the inclusion of “oral communication” is an unthinking oversight, it’s also possible they mean for this vile mess to apply to various forms of “code talking”. The use of code words or obscure languages may not meet the modern definition of encryption, but the desire to be able to demand that someone produce the code book they’ve been using makes at least some sense.