What Changed?
In 2016, the FBI asked Apple to create a backdoor that would allow access to an iPhone belonging to a perpetrator of the San Bernardino terrorist attack. Apple, famously, refused. In an open letter to customers, the company explained:
> We have great respect for the professionals at the FBI, and we believe their intentions are good. Up to this point, we have done everything that is both within our power and within the law to help them. But now the U.S. government has asked us for something we simply do not have, and something we consider too dangerous to create. They have asked us to build a backdoor to the iPhone.
>
> Specifically, the FBI wants us to make a new version of the iPhone operating system, circumventing several important security features, and install it on an iPhone recovered during the investigation. In the wrong hands, this software — which does not exist today — would have the potential to unlock any iPhone in someone’s physical possession.
>
> The FBI may use different words to describe this tool, but make no mistake: Building a version of iOS that bypasses security in this way would undeniably create a backdoor. And while the government may argue that its use would be limited to this case, there is no way to guarantee such control.
This open letter embodies a type of corporate courage that is rarely seen. Apple took a stand to meaningfully guard the security and privacy of its customers.
Recently, Apple announced changes, to be included in this fall’s OS updates, aimed at combating the spread of Child Sexual Abuse Material (CSAM). I suggest reading the whole announcement, titled “Child Safety,” yourself. In the announcement, Apple outlines its three-pronged approach:
- Explicit image filtering in iMessage for child accounts
- Interventions in Siri and search for CSAM-related queries
- On-device image scanning for CSAM in iCloud Photos
It’s the third of these that has managed to stir up considerable controversy. Here’s how Apple describes it in the announcement:
> Apple’s method of detecting known CSAM is designed with user privacy in mind. Instead of scanning images in the cloud, the system performs on-device matching using a database of known CSAM image hashes provided by NCMEC and other child safety organizations. Apple further transforms this database into an unreadable set of hashes that is securely stored on users’ devices.
>
> Before an image is stored in iCloud Photos, an on-device matching process is performed for that image against the known CSAM hashes. This matching process is powered by a cryptographic technology called private set intersection, which determines if there is a match without revealing the result. The device creates a cryptographic safety voucher that encodes the match result along with additional encrypted data about the image. This voucher is uploaded to iCloud Photos along with the image.
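To make those moving parts a bit more concrete, here is a minimal sketch of that flow in Python, with all of the real cryptography stripped out. It is not Apple’s implementation: the actual system uses a perceptual hash called NeuralHash, a blinded on-device hash database, and private set intersection, so the device itself never learns whether a match occurred, and vouchers can only be opened server-side after an account crosses a threshold of matches. The names below (`image_fingerprint`, `make_safety_voucher`, `KNOWN_HASHES`) are hypothetical stand-ins.

```python
# A deliberately simplified sketch of the flow described above, not Apple's
# implementation. The real design blinds the hash database and uses private
# set intersection, so the device never learns the match result; here that
# is reduced to plain set membership so the pipeline's shape stays visible.

import hashlib
import json
from dataclasses import dataclass

# Hypothetical stand-in for the database of known-CSAM image hashes shipped
# to the device (blinded and unreadable on-device in the real design).
KNOWN_HASHES: set[str] = set()


def image_fingerprint(image_bytes: bytes) -> str:
    """Stand-in for NeuralHash: derive a content fingerprint of the image."""
    return hashlib.sha256(image_bytes).hexdigest()


@dataclass
class SafetyVoucher:
    """Encodes the match result plus additional data about the image.

    In Apple's design this payload is encrypted so that neither the device
    nor the server (below a threshold of matching images) can read it; that
    layer is omitted here.
    """
    payload: bytes


def make_safety_voucher(image_bytes: bytes) -> SafetyVoucher:
    """On-device step that runs before an image is stored in iCloud Photos."""
    fingerprint = image_fingerprint(image_bytes)
    # Real system: private set intersection against the blinded database.
    # Here: plain set membership, which the device can of course observe.
    matched = fingerprint in KNOWN_HASHES
    return SafetyVoucher(
        payload=json.dumps({"match": matched, "fingerprint": fingerprint}).encode()
    )


def upload_to_icloud_photos(image_bytes: bytes) -> None:
    """The safety voucher travels with the image on every upload."""
    voucher = make_safety_voucher(image_bytes)
    # Stand-in for the actual upload call.
    print(f"uploading {len(image_bytes)}-byte image "
          f"with {len(voucher.payload)}-byte safety voucher")
```

Note that a voucher accompanies every upload, match or not, which is part of how the real design keeps the result hidden from the device.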
Understandably, the addition of a system capable of scanning and reporting the files on your device has made many people uneasy. I’m not against the goal of combating CSAM, and quite frankly, to assume otherwise of anyone in this discussion would be to argue in bad faith. But I am opposed to this measure, and I’m opposed to it for the same reason Apple refused the FBI years ago.
In the same way that it’s fundamentally impossible to create a backdoor that only works against the iPhones of terrorists, there’s no way to make a media scanner that can only be used to detect CSAM. The scanning machinery is generic; what it flags is determined entirely by the hash database it is handed.
Further reading: Apple’s Mistake