Last month Apple revealed that, in a nutshell, it wants to begin scanning users’ iCloud photo libraries to root out child sexual abuse material (CSAM). Noble intentions or not, the internet was none too pleased with the idea of Big Tech poking its nose into people’s personal photos, with some referring to the proposed feature as a thinly veiled surveillance system.
In response to the enormous backlash it’s received, Apple has decided to delay the rollout of this contentious feature while it works out how to make it slightly more appealing.
Apple pumps the brakes
Now, for the sake of clarity, it’s not quite as simple as Apple thumbing through all of your photos like a five-year-old touching exhibits at a museum. Rather, the Californian tech giant wants to scan your images using “…a cryptographic technology called private set intersection, which determines if there is a match [with existing CSAM] without revealing the result.” Users can also opt out of the entire process, and as much as that might be a half-hearted consolation to the privacy-conscious, it kind of undermines the entire affair in the process.
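To make the idea concrete, here’s a deliberately simplified sketch of hash-based matching. This is not Apple’s actual system (which pairs a perceptual “NeuralHash” with a cryptographic private set intersection protocol so that neither side learns anything about non-matches); the database contents and function names below are invented for illustration. It only shows the core notion: fingerprints get compared, never the raw photos.

```python
import hashlib

def fingerprint(data: bytes) -> str:
    """Stand-in for a perceptual image hash; here just SHA-256 of the bytes."""
    return hashlib.sha256(data).hexdigest()

# Hypothetical provider-side database of known-bad fingerprints.
known_bad = {fingerprint(b"example-known-image")}

def matches_database(image_bytes: bytes) -> bool:
    # Only the hash is compared against the database, never the image itself.
    return fingerprint(image_bytes) in known_bad

print(matches_database(b"example-known-image"))  # matches the database
print(matches_database(b"a-holiday-photo"))      # does not match
```

The crucial difference in a real private set intersection scheme is that even this match/no-match result stays cryptographically hidden until a reporting threshold is crossed; the naive lookup above would reveal it immediately.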
Despite Apple’s reassurances, cybersecurity experts have criticised the company for proposing what amounts to a surveillance system in sheep’s clothing, even if its intended use is admirable.
Apple told The Verge, “Last month we announced plans for features intended to help protect children from predators who use communication tools to recruit and exploit them, and limit the spread of Child Sexual Abuse Material.”
“Based on feedback from customers, advocacy groups, researchers and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features.”
It goes without saying that using your platform to root out CSAM and those who spread or possess it is a mission worth pursuing. But it’s clear that this particular feature needs some (or a lot) more thought and tweaking before users will be willing to support it.