This week Apple announced that they were introducing a handful of new features to iOS and iCloud targeted at detecting child pornography in iMessage and Photos. Naturally, a cacophony of young, childless males on Hacker News erupted to claim that the very idea of privacy was now moot and that end-to-end encryption was now nonsense.
Articles discussing the inherent difficulty of hashing photos to detect explicit content cropped up. The EFF published a solid review of the proposed changes and decried the dissolution of privacy for people all around the globe. Debate is good. Critical eyes and thoughtful minds should absolutely scrutinise the work of private companies – especially when they have such a dominant role in the global market.
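To see why hashing photos is genuinely difficult, it helps to contrast a cryptographic hash with a perceptual one. The sketch below is purely illustrative – a toy "average hash" over an eight-pixel image, not Apple's NeuralHash or any real matching system. It shows the core tension: a cryptographic hash changes entirely when a single pixel is nudged, so it can't catch near-duplicates, while a perceptual hash tolerates small edits – which is exactly what makes designing one that is robust but not over-broad so hard.

```python
# Toy comparison of cryptographic vs perceptual hashing.
# This is an illustrative sketch, not any real detection system.
import hashlib

def average_hash(pixels):
    """Return a bit string: 1 where a pixel exceeds the mean brightness."""
    mean = sum(pixels) / len(pixels)
    return "".join("1" if p > mean else "0" for p in pixels)

original = [10, 200, 30, 220, 15, 210, 25, 205]   # toy 8-pixel "image"
tweaked  = [12, 200, 30, 220, 15, 210, 25, 205]   # one pixel nudged by 2

# A cryptographic hash diverges completely on a one-pixel edit...
print(hashlib.sha256(bytes(original)).hexdigest() ==
      hashlib.sha256(bytes(tweaked)).hexdigest())      # False

# ...while the perceptual hash is unchanged, so near-duplicates still match.
print(average_hash(original) == average_hash(tweaked))  # True
```

The hard part, which real systems spend enormous effort on, is keeping that tolerance tight enough that unrelated images don't collide.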
However, I can’t help but feel like the response is fabulously misguided. As usual with discussions on social media, the vast majority of people furiously commenting haven’t actually looked at the feature. Take a look:
For a large portion of the population the gut-wrenching world of child exploitation is mercifully abstract and, in most circles, brushed under the carpet. A dark and unknown enemy. But for the children who are subject to abuse (orders of magnitude more than the number you have in your head) and those who work to protect them – the people who have to review reports of such imagery, those who chase down the sprawling networks of trafficking and hunt those responsible for some of the most despicable crimes that one human could commit against another – the reality numbs mind and body, reaching into the very deepest recesses of what you know as the human experience.
Digital networks have provided easy access for predators to find and connect with vulnerable children. And let’s be clear. Every child is vulnerable when they have access to the Internet.
We teach our children to trust adults from a very young age – their naïvety is the source of their vulnerability. Every parent struggles to walk the line between innocence and susceptibility, talking in innuendo and trying to equip our kids with the tools to detect subversive behaviours – to spot the signs that something might be amiss. We also outright restrict access to certain materials or areas to try to prevent harm.
But predators are becoming more sophisticated every day. And I welcome the lengths that Apple is going to in order to increase protections for children, rather than simply ignoring the problem, or pretending that it’s too complex to address.
Could the technology be abused? Absolutely. But if you’re an Apple product owner who believes that you have absolute freedom with your technology, you’re more naïve than the children this feature is trying to protect.