Just over one week ago, the New York Times published a major investigation into the intractable problem of illegal sexual images of minors being exchanged online. Despite flaws in the story and its companion pieces, its main takeaway, that Internet companies have failed to adequately address the problem, has resonated widely.
Prostasia Foundation too has been critical of some of the Internet platforms called out in the article. But at the same time, we need to be realistic about how much responsibility we can (or should) place on tech firms to solve this problem. In a previous newsletter, we wrote that “the large platforms have already done most of what can be done to prevent the sharing of known illegal images, by ensuring that images are scanned against a database of illegal content before they can be uploaded or shared.”
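To make concrete what that scanning typically involves, here is a minimal sketch of upload-time hash matching. It is an illustration only: production systems use perceptual hashing (Microsoft’s PhotoDNA is the best-known example), which tolerates resizing and re-encoding, whereas the cryptographic hash below catches only exact copies, and the hash set shown is a hypothetical stand-in for the industry databases of known illegal content.

```python
import hashlib

# Hypothetical stand-in for a database of hashes of known illegal images,
# such as those distributed to platforms by child protection organizations.
KNOWN_IMAGE_HASHES: set[str] = set()

def should_block_upload(image_bytes: bytes) -> bool:
    """Return True if the uploaded image matches a known-image hash.

    Real deployments use perceptual hashes (e.g., PhotoDNA) so that
    trivially altered copies still match; SHA-256 is used here only
    to keep the sketch self-contained.
    """
    digest = hashlib.sha256(image_bytes).hexdigest()
    return digest in KNOWN_IMAGE_HASHES
```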
Does this mean that they couldn’t do more to prevent online child sexual abuse? Certainly, they could. They could implant spying tech into your web browser so that it checks every image that you load and every website that you visit. The microphones in your home devices could become always-on bugs that listen for sounds from child exploitation videos. A back door could be added to encrypted messaging apps, opening up your communications to government surveillance.
The New York Times investigation has pushed the last of these ideas back into the political spotlight. Last Thursday, law enforcement and national security officials from the United States, the United Kingdom, and Australia wrote a letter to Facebook urging it to hold off on its plans to add strong encryption to its Facebook Messenger app. The following day, Deputy Attorney General Jeffrey Rosen reiterated those demands at a summit at FBI headquarters.
The relevant question here isn’t whether tech companies could do more to intercept child sexual abusers; of course they could. The question is whether they should. If there is any use case that could justify such intrusive surveillance, the fight against child sexual abuse is it. But there are limits to what we allow governments and private companies to do, even in the pursuit of an important objective such as investigating crime. Human rights law sets those limits, and the right to communicate privately is one of them.
That’s why dozens of civil society groups, including Prostasia Foundation (the only child protection organization among them), pushed back against the governments’ demands to Facebook in an open letter that we released last week, stating, “default end-to-end security will provide a substantial boon to worldwide communications freedom, to public safety, and to democratic values, and we urge you to proceed with your plans to encrypt messaging through Facebook products and services.”
It’s easy to see why this isn’t an entirely satisfactory answer for some, because it seems to suggest that we should just give up in the face of the horrible crime of online child sexual abuse. But that’s not true at all; it simply means that we need to find other, better methods of addressing the problem. For example, rather than attempting to outlaw strong encryption, perhaps we could actually leverage encryption to promote child protection, as part of a broader primary prevention approach.
That’s the basis for Prostasia Foundation’s concept for a project that would use the strong encryption and anonymity that underpin the Tor network to provide information and support resources to those who are at risk of offending. As we describe in our concept note, this project “will demonstrate our rejection of the narrative that the strong encryption technologies that enable privacy and anonymity online are incompatible with child protection.”
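As a rough illustration of how little machinery that idea requires, the sketch below publishes a local web server of support resources as a Tor onion service using the stem library. Everything here is an assumption about one possible implementation, not a description of the project itself; it presumes a local Tor daemon with its control port enabled and a resource site already running on port 8080.

```python
# Sketch: exposing a support-resource site as a Tor onion service.
# Assumes a local Tor daemon with ControlPort 9051 enabled and the
# stem library installed (pip install stem). Port 8080 is a placeholder
# for wherever the resource site is actually served.
from stem.control import Controller

with Controller.from_port(port=9051) as controller:
    controller.authenticate()  # cookie authentication by default
    # Map port 80 of the onion address to the local web server.
    service = controller.create_ephemeral_hidden_service(
        {80: 8080}, await_publication=True
    )
    print("Resources reachable at %s.onion" % service.service_id)
    input("Press enter to take the service down...")
    # The ephemeral service is removed when the controller disconnects.
```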
The difference between an approach prioritizing the detection and prosecution of offenders and our prevention-focused approach is the difference between viewing child sexual abuse primarily as a crime and viewing it primarily as a public health issue. Many of the circumstances in which minors suffer sexual harm don’t fit well within a criminal law frame: for example, a majority of new illegal images of minors are selfies, and about a third of perpetrators are minors themselves. That’s why framing child sexual abuse as a preventable public health problem enjoys increasing support among experts. Confoundingly, however, the funding dedicated to prevention initiatives is a tiny fraction of the amount dedicated to the carceral approach. This has to change.
If we really want to prevent people from accessing images and videos of child sexual abuse, we need to get over the idea that controlling the channels by which those images are exchanged is a viable solution to the problem. We aren’t going to be able to stuff the encryption genie back in its bottle. Hanging our hopes on that, and forcing tech platforms to cripple their products, takes the heat off our own responsibility to be part of a broader culture of the primary prevention of abuse.