At the end of 2022, Apple canceled its initiative to detect child sexual abuse photos stored in iCloud. From the moment it was announced, the plan generated great controversy because, despite its good intentions, it could lead to serious privacy problems. And although the Cupertino company initially stuck to its strategy despite criticism from the public and dozens of civil organizations, it ended up scrapping the whole thing without further explanation. Until now.
For the first time, Apple has explained why it ended its initiative to scan photos stored in iCloud for child abuse material. Erik Neuenschwander, vice president of user privacy and child safety at the Californian firm, detailed the reasons in a letter sent to Sarah Gardner, CEO of the Heat Initiative, a child safety group (via WIRED).
Initially, Gardner sent an email to Tim Cook and other Apple executives expressing her dissatisfaction with the initial delay and subsequent cancellation of the plan to detect child abuse photos on the company's devices and cloud services. She also informed them that her group would start a campaign to publicly pressure Apple to implement a system to detect, report and delete child pornography stored in iCloud, as well as to promote the creation of a mechanism that would allow users of its products to report the existence of this type of material.
Neuenschwander’s response to that demand is very interesting. The executive openly acknowledged that Apple could not solve the technical challenge of building a child abuse photo detection system that was effective without compromising users’ privacy.
Apple consulted technology, security and human rights experts, and studied the scanning technology “from virtually every angle possible.” The conclusion was blunt: what the company intended to achieve was impossible without making concessions on privacy.
Apple and the cancellation of its initiative to detect photos of child abuse in iCloud
In the email sent to the leader of the Heat Initiative, Neuenschwander points out that child pornography is abhorrent, and lists the many steps Apple has taken to protect minors who use its devices. However, he then proceeds to detail why the initiative to detect this type of material in iCloud was canceled.
“Businesses routinely use cloud scanning of personal data to monetize their users’ information. While some companies have justified these practices, we have chosen a very different path: one that prioritizes the security and privacy of our users. We believe that scanning the iCloud content privately stored by each user would have serious unintended consequences for them. (…) Scanning information privately stored in iCloud would create new threat vectors that data thieves could find and exploit.
It would also inject the possibility of a slippery slope of unintended consequences. Searching for one type of content, for example, would open the door to mass surveillance and could create a desire to scan other encrypted messaging systems for other categories and types of content (such as images, video, text or audio). How can users be sure that a tool for one type of surveillance has not been reconfigured to monitor other content, such as political activity or religious persecution? Mass surveillance tools have widespread negative implications for freedom of expression and, by extension, for democracy as a whole. Moreover, designing this technology for one government could lead to demands to apply it in other countries, across new types of data.
Scanning systems are not foolproof either, and there is documented evidence from other platforms where innocent people have been drawn into dystopian dragnets that have victimized them when they had done nothing more than share perfectly normal and appropriate photos of their babies.”
Excerpt from the email sent by Erik Neuenschwander to Sarah Gardner.
It’s not often that Apple talks openly about its product or service strategy decisions, but its initiative to detect child abuse photos in iCloud clearly warranted it. It is also likely that Neuenschwander’s message was not intended for public release. Even so, it at least gives a clearer picture of what the company weighed when dealing with this matter.
One of Apple’s most controversial plans
Let’s remember that Apple initially stood by its decision to scan devices such as the iPhone for photos of child abuse. In fact, Craig Federighi, the company’s senior vice president of software engineering, acknowledged that the announcement had caused confusion, and he even tried to explain the approach the company was taking to preserve user privacy.
At the time, the executive explained that the system would be based on CSAM hashes. What did this mean? That Apple would rely on a database of hashes of known child sexual abuse photos, compiled by child protection organizations and independently audited. The technology would only raise an alarm when a certain number of matches were detected.
“If, and only if, you reach a threshold of around 30 matching known child pornography photos, then Apple will know about your account and about those photos. And at that point it will only know about those, and not about any of your other images. This is not an analysis of whether you had a picture of your son in the bathtub. Or, for that matter, whether you had a pornographic image of any other kind. This literally only matches the exact fingerprints of specific known child pornography images,” Federighi explained to The Wall Street Journal in August 2021.
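To make the threshold idea Federighi described a bit more concrete, here is a minimal sketch in Swift of that logic: an account would only be flagged once the number of images matching a known-hash database reached roughly 30. Keep in mind that Apple’s actual proposal relied on a perceptual hash (NeuralHash) combined with cryptographic techniques such as private set intersection and threshold secret sharing; the plain SHA-256 comparison and the type names below are illustrative assumptions, not the company’s implementation.

```swift
import Foundation
import CryptoKit

// Simplified illustration of threshold-based hash matching.
// Apple's real design used a perceptual hash (NeuralHash) plus private set
// intersection and threshold secret sharing; the exact-hash comparison here
// only demonstrates the "alarm only above N matches" idea from the quote.
struct MatchScanner {
    /// Hashes of known abuse images, as compiled by child-safety organizations (hypothetical input).
    let knownHashes: Set<Data>
    /// Threshold quoted by Federighi: on the order of 30 matches.
    let threshold = 30

    /// Returns true only if the number of matching images reaches the threshold.
    /// Below the threshold, nothing about the account would be reported.
    func shouldFlag(images: [Data]) -> Bool {
        let matches = images.reduce(0) { count, image in
            let digest = Data(SHA256.hash(data: image))
            return knownHashes.contains(digest) ? count + 1 : count
        }
        return matches >= threshold
    }
}
```

The design rationale, as Federighi described it, is that no single match reveals anything about an account; only an accumulation of around 30 matches against the known database would ever trigger a review.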
Despite that ironclad initial position, Apple concluded that there was no way to guarantee privacy while searching for child abuse photos. Not even if the processing was done, for example, on the iPhone itself rather than on iCloud servers.
With Neuenschwander’s explanation, we now at least have a more precise account of this story. Let’s not forget, however, that Apple has continued to work on improving protections for minors who use its devices. One of the most notable examples is the nudity alerts for images sent via iMessage.