
WhatsApp head calls Apple’s plan to scan iPhones for images of child abuse a “surveillance system”


In case you somehow missed it, last week Apple detailed its plans to scan iPhones (and iCloud accounts) in the States for images of child abuse. WhatsApp head Will Cathcart followed the announcement with a Twitter thread calling Apple’s plans to implement ‘neuralMatch’ on its devices a “surveillance system”.

And, honestly, it’s not really hard to see his point. Cathcart made his comments after being asked whether WhatsApp would implement a similar system on its platform. (The answer to that question is “No”, by the way.)

Point to WhatsApp 

It’s really not hard to see what Apple wants to do with its neuralMatch system, and it’s an admirable aim. Limiting the spread of child pornography — or Child Sexual Abuse Material (CSAM), as it is now known — isn’t something that anyone wants to fault. But WhatsApp’s Will Cathcart points out that Apple’s initiative constitutes an “…Apple built and operated surveillance system that could very easily be used to scan private content for anything they or a government decides it wants to control.”

Apple might “…use new applications of cryptography to help limit the spread of CSAM online, while designing for user privacy” in its operating systems — specifically by scanning for, detecting and reporting known images of child abuse to the National Center for Missing and Exploited Children — but it doesn’t necessarily end there.

It would be just as possible to use such a system to scan for images of Winnie the Pooh or Gollum, or something that actually matters to more than a few fragile egos in positions of power, and report that to someone — that’s the issue our friendly WhatsApp head is pointing out. Just because it’s intended for a noble purpose doesn’t mean it’ll always be used that way.

Apple, for its part, points out that the system will only scan iCloud Photos images on upload using “…a cryptographic technology called private set intersection, which determines if there is a match [with existing CSAM] without revealing the result” and that uploads to this platform can be disabled — which might not be the argument they think it is.
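For a rough sense of what “a match” means here: the core of any system like this is comparing a fingerprint of each photo against a list of fingerprints of already-known abuse images, so that only exact matches are ever flagged. The sketch below is a deliberately simplified illustration in Python. The hash value and function names are invented, and Apple’s actual system relies on perceptual hashing plus the cryptographic private set intersection it describes, which is far more involved than a plain lookup.

# Deliberately simplified sketch: the real system uses perceptual hashes and
# private set intersection, not a plain SHA-256 lookup like this.
import hashlib

# Hypothetical database of fingerprints of known images (value is invented).
KNOWN_IMAGE_FINGERPRINTS = {
    "3a7bd3f5a123c9e1b2d4f6a8c0e2b4d6f8a0c2e4b6d8f0a2c4e6b8d0f2a4c6e8",
}

def fingerprint(image_bytes: bytes) -> str:
    """Reduce an image to a short, fixed-length fingerprint."""
    return hashlib.sha256(image_bytes).hexdigest()

def flag_on_upload(image_bytes: bytes) -> bool:
    """Return True only if the photo exactly matches a known image."""
    return fingerprint(image_bytes) in KNOWN_IMAGE_FINGERPRINTS

# An ordinary photo isn't in the database, so it passes through unflagged.
print(flag_on_upload(b"holiday-snap.jpg"))  # False

The claimed difference with Apple’s design is that the comparison happens cryptographically, without revealing the result of any individual check to anyone, which is where the private set intersection quoted above comes in.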

Yes, the option to disable iCloud Photos uploads means that users concerned about privacy (and those who traffic in images and video of child abuse) can easily opt out, but then what’s the point of a system that will only catch stupid or uninformed abusers? It’s the clever ones who are the problem.

Apple also points out that the system is an integral part of iOS itself, meaning there’s no way to make it work only in certain regions. So you won’t have one regional government or another checking up on its citizens and making sure they’re not hoarding other kinds of known images — unless the check is being done on every device running iOS, we guess?

Both sides have their points here. Child abuse is obviously a terrible thing, but Apple’s system has implications beyond that. In the wrong hands, the potential for censorship or a crackdown on dissidents is devastating. Still, it’s perhaps not surprising that the guy in charge of WhatsApp is the one to complain. Apple and Facebook are currently arguing about user privacy, so it’s not unexpected that someone connected to the latter has taken a shot at Apple for possibly compromising its users’ privacy.
