Last week during the ICANN meeting in Barcelona I attended a short presentation from the Internet Watch Foundation (IWF).
Their mission is pretty simple:
"…eliminate child sexual abuse imagery online" (source)
Fortunately, the presentation I attended did not include any of the actual material (which would have been illegal anyway), but even without seeing any of it, the topic is one that I think most people find deeply disturbing.
You can dig into some of the data in their interactive annual report, which contains some truly troubling numbers, including the one that disturbs me most:
2% involve children aged 2 or under.
One of the tools they've made available to third parties is a system that recognises known images by comparing their hashes against a list. Here's a video they've created to explain how it works:
Several content networks are using this and similar technologies to help keep their platforms free of this kind of content.
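In essence, the service works like a blocklist of digital fingerprints: each known image is reduced to a hash, and a platform compares the hashes of uploaded images against that list. Systems of this kind typically rely on robust perceptual hashes so that slightly altered copies still match; the minimal sketch below substitutes a plain cryptographic hash purely to illustrate the lookup, and every name and path in it is hypothetical rather than part of the IWF's actual service.

```python
import hashlib
from pathlib import Path

# Hypothetical blocklist of known-bad image hashes (hex digests).
# A real hash list would use perceptual hashing so near-duplicates
# still match; SHA-256 is used here only to keep the sketch simple.
KNOWN_HASHES = {
    # placeholder entries, not real data
}

def image_matches_blocklist(path: Path) -> bool:
    """Return True if the file's hash appears on the blocklist."""
    digest = hashlib.sha256(path.read_bytes()).hexdigest()
    return digest in KNOWN_HASHES

if __name__ == "__main__":
    # Example: scan an uploads folder and flag anything that matches.
    for upload in Path("uploads").glob("*.jpg"):
        if image_matches_blocklist(upload):
            print(f"flagged for review: {upload}")
```

The appeal of this approach is that platforms never need to host or view the material itself; they only need the list of hashes to check new uploads against.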
If you come across CSAM (child sexual abuse material), you should report it to your local hotline. The Irish one is here, while the international organisation that co-ordinates hotlines is here.