Apple Needs to Abandon Its Image Scanning Tech

More than 90 organizations around the world that work to protect human, civil, and digital rights published an open letter to Apple last week, calling on the company to reconsider its plans to scan devices for child sexual abuse material (CSAM).


The group says that, while protecting children and limiting the spread of CSAM are worthy goals, the proposal could also be used to censor speech and could threaten the privacy and security of people around the world.

The organizations also added:

The breadth of the international coalition joining the letter demonstrates the extent to which Apple’s plans open the door to threats to human rights across the globe.

The features Apple announced earlier this month include the detection of CSAM-related search queries in Siri and Search, where users will be directed to information about why the search is harmful and how they can get help.

The more controversial part of the plans involves using machine learning to detect whether explicit images are being sent or received by users registered on a family account as under the age of 13. The child will receive a notification explaining that this kind of material can be harmful and that their parents will be alerted if they choose to view or send the image.
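
A rough sketch of that flow might look like the following. The names, types, and score threshold here are purely illustrative assumptions, not Apple's actual on-device implementation:

```swift
import Foundation

// Illustrative sketch of the described Messages flow: an on-device classifier
// scores an incoming or outgoing image; if it looks explicit, the child is
// warned, and parents on the family account are notified only if the child
// chooses to proceed. All names and the 0.9 threshold are assumptions.
struct ImageMessage {
    let data: Data
}

// Stand-in for an on-device ML model returning an "explicit content" score.
func explicitContentScore(for image: ImageMessage) -> Double {
    return 0.0 // placeholder; a real model would analyze image.data
}

func handle(_ image: ImageMessage, childChoosesToProceed: () -> Bool) {
    guard explicitContentScore(for: image) > 0.9 else { return }
    print("Warning shown to child: this image may be harmful.")
    if childChoosesToProceed() {
        print("Parents on the family account are notified.")
    }
}
```

Note that in the announced feature the parental alert is conditional on the child's choice to view or send the image, which is what the warning shown to the child describes.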

The open letter noted that algorithms designed to detect sexually explicit material are notoriously unreliable and can mistakenly flag art, health information, educational resources, and other imagery. It added that the corresponding parental notification feature is open to abuse.

The part of Apple’s plan raising the most concern is considerably more complex. Using cryptographic techniques, content uploaded to iCloud will be checked against known CSAM in the database of the US National Center for Missing and Exploited Children (NCMEC). If the system finds that CSAM is being collected in a user’s iCloud account, Apple plans to automatically alert the authorities.
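
To illustrate the general idea of matching uploads against a database of known fingerprints, here is a minimal sketch. It uses a plain SHA-256 hash and made-up placeholder values; Apple's announced system instead relies on a perceptual hash (NeuralHash) and cryptographic protocols that keep matches private until a threshold is reached, so this is only an approximation of the concept:

```swift
import CryptoKit
import Foundation

// Illustrative sketch of hash-based matching against a known-CSAM database.
// The fingerprints and threshold below are placeholders, not real values.
let knownHashes: Set<String> = ["placeholder-fingerprint-1", "placeholder-fingerprint-2"]

// Hypothetical number of matches before an account would be flagged.
let matchThreshold = 30

// Fingerprint an image's bytes (an exact hash here, unlike a perceptual hash,
// which tolerates resizing and other small changes to the image).
func fingerprint(_ imageData: Data) -> String {
    SHA256.hash(data: imageData).map { String(format: "%02x", $0) }.joined()
}

// Count uploads whose fingerprints appear in the known database.
func matchCount(in uploads: [Data]) -> Int {
    uploads.filter { knownHashes.contains(fingerprint($0)) }.count
}

// Flag the account only once the number of matches crosses the threshold.
func shouldAlertAuthorities(for uploads: [Data]) -> Bool {
    matchCount(in: uploads) >= matchThreshold
}
```

The design point Apple emphasizes is that uploads are compared against fingerprints of already-known material from the NCMEC database, rather than being analyzed for new content.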

The open letter said that this could create a precedent of adding image-scanning backdoors to Apple’s systems, and governments could compel Apple to extend notification to other accounts for purposes other than child protection.

Just last week, Apple said it would refuse any such demands from governments to add non-CSAM images to its new image scanning process.

The letter raised the concern that Apple and other big tech giants will face enormous pressure, and potentially legal requirements, from governments around the world to scan photos not just for CSAM, but also for other images that a government finds objectionable.

The open letter added:

Those images may be of human rights abuses, political protests … or even unflattering images of the very politicians who will pressure the company to scan for them. And that pressure could extend to all images stored on the device, not just those uploaded to iCloud. Thus, Apple will have laid the foundation for censorship, surveillance and persecution on a global basis.

Despite Apple’s claims that it will always refuse government pressure to aid in repression, The Citizen Lab reported this week that the company already bans many political phrases from iPhone engravings in China, Taiwan, and Hong Kong.

The letter concluded that while the proliferation of CSAM is undoubtedly a serious issue and that efforts to protect children are laudable, Apple’s plans put children and its other users at risk, both now and in the future. The coalition asked the company to abandon the plans and work with civil society groups in future on issues of privacy.

This is the latest in a slate of criticism the tech giant has received for the plans, including from its own employees. Rules for detecting child abuse content online passed by the European Parliament in July drew similar responses.