Last month INHOPE, a global trade association of child abuse reporting hotlines, rejected a joint call from Prostasia Foundation, the National Coalition Against Censorship, Article 19, and the Comic Book Legal Defense Fund for its members to stop treating cartoons as if they were images of child sexual abuse. As our joint letter pointed out, INHOPE’s conflation of offensive artwork with actual abuse images has resulted in the misdirection of police resources against artists and fans—predominantly LGBTQ+ people and women—rather than towards the apprehension of those who abuse real children.
INHOPE is not a child protection organization, but an industry association for organizations and agencies that provide censorship services to government and private industry. Its Articles of Association are surprisingly explicit about this: its objective is to “facilitate and promote the work of INHOPE Member Hotlines, whose work is to eradicate illegal content, primarily child sexual abuse material, on the internet” [emphasis added].
It executes this mission by collecting personal information of those who share images that are reported to it (which can include a name, email address, phone number, and IP address), and sharing this information among its member hotlines and with police. Again, it is explicit about this, acknowledging that its “core business revolves around the exchange of sensitive data.” INHOPE members have actively lobbied to weaken European privacy rules so that they can maintain these data collection practices, while refusing to accept a compromise allowing continued scanning for actual child abuse images.
Such data collection is clearly justifiable when it is limited to actual sexual abuse images. But INHOPE’s data collection isn’t limited to this. It siphons up all manner of reports that its members declare to be illegal in their country, and (with one exception mentioned below) gives them another “once-over” to determine whether they are illegal worldwide, only in the reporting or hosting country, or not at all, before forwarding them to INTERPOL. Even if this assessment leads to a determination that the images are lawful, INHOPE doesn’t delete them. Inexplicably, it instead classifies them as “Other Child-Related Content,” retains them in a database, and sends them to law enforcement for what it describes as “documentation purposes.”
Images reported by NCMEC, the American hotline, undergo even less vetting. Despite being an INHOPE member, NCMEC doesn’t utilize the services of INHOPE analysts, but directly shares reported images and associated personal information with law enforcement agencies around the world. According to Swiss authorities, up to 90% of these images are later found to be lawful.
INHOPE chose to mischaracterize our call as being grounded in a misunderstanding of the fact that some countries do prohibit artistic sexual representations of minors by law. But our letter explicitly acknowledged that fact, by calling on INHOPE to establish a policy for its members that “artistic images should not be added to image hash lists that INHOPE members maintain, and should not be reported to authorities, unless required by the law where the hotline operates” [emphasis added].
There are indeed some countries in which lawmakers do ill-advisedly use the same laws to criminalize the dissemination of offensive art as they use to prohibit the image-based abuse of real children. But the risks of an international organization allowing national authorities to act as gatekeepers of the images that it treats as child abuse and reports to INTERPOL should be obvious.
For example, Canada’s overbroad child pornography laws have recently drawn public attention over the much-criticized prosecution of an author and publisher for a novel that includes a brief scene of child sexual abuse in its retelling of the story of Hansel and Gretel. The Canadian Center for Child Protection, one of only two INHOPE members that proactively searches for illegal material, was responsible for the arrest of a 17-year-old girl for posting artwork to her blog, when it reported her to authorities in Costa Rica, where such artwork is also illegal.
In other countries where cartoon images are illegal, criminal laws are used to disproportionately target and criminalize LGBTQ+ people and women. An example given in our letter was the case of a Russian trans woman who was arrested over cartoon images and sentenced to imprisonment in a men’s prison.
Russia’s INHOPE member, the Friendly Runet Foundation, encourages people to report if they are “exasperated by the on-line materials transgressing morality,” and boasts that it was “created at the direct participation and works in close partnership with the Department “K” of the Russian ministry of Interior.” This terminology, and the hotline’s association with the ministry that criminalized “gay propaganda,” is understood by Russian citizens as an attack on LGBTQ+ people’s speech. Notably, no LGBTQ+ representatives sit on INHOPE’s Advisory Board.
INHOPE can’t do anything, directly, about unjust national laws that conflate artistic images with child abuse. INHOPE and its members also can’t do much to prevent conservative members of the public from reporting non-actionable content (although one member has taken steps to address this problem). That’s why we are directly targeting the public with our “Don’t report it, block it” information campaign, to stem such false reports at the source.
But what INHOPE can do is decide how to handle the reports it receives about artistic content. Passing them to law enforcement authorities, using a censorship and surveillance infrastructure that was established to deal with real images of child sexual abuse, isn’t its only option here. Neither is it necessary to place those who share such images in the crosshairs of police, especially in countries that have unjust laws or repressive governments.
In 2019, we held a seminar with Internet companies and experts to discuss more proportionate ways of dealing with content such as child nudity, child modeling, and artistic images—content that doesn’t rise to the legal definition of child abuse, but which can still be triggering or offensive, or harmful when shared in the wrong context. Through a multi-stakeholder process, this resulted in the development of a set of principles for sexual content moderation and child protection that were launched at last year’s Internet Governance Forum.
INHOPE already has a Code of Practice that its members are required to comply with. To be clear, some INHOPE members already do have good practices, and Britain’s Internet Watch Foundation (IWF) is one of these: although cartoon images are unlawful in the United Kingdom and the IWF is mandated to accept reports about them, it doesn’t include these reports in its hash lists of abuse images, nor share them with foreign police. Our joint letter invited INHOPE to take the opportunity to amend its Code of Practice to apply similar standards to its other members. Its decision not to consider this doesn’t reflect well on the organization.
Internet reporting hotlines are selling a product to law enforcement authorities: a censorship service for which actual images of child abuse are only the selling point. This can be a lucrative gig; NCMEC alone received $33 million from the United States government in 2018. Therefore, as a business proposition, it makes sense for INHOPE and its members to ask few questions about the scope of the censorship services their governments call upon them to provide. Conversely, since almost no federal money is being allocated towards abuse prevention, there is little incentive for them to invest in prevention interventions that could reduce abuse in the long run.
But these perverse incentives are leading it down a dangerous path. It’s time for us to call this censorship cartel to account, and to demand that it consider the human rights of the innocent people who are being hurt by its approach. The plain fact is that INHOPE doesn’t represent the voices of experts who work on child sexual abuse prevention, it represents the law enforcement sector. By refusing to curtail its activities to place the censorship of artistic images outside its remit, INHOPE has lost the moral authority that provides the only justification for its sweeping and dangerous powers.