Apple Unveils ‘Controversial’ Plan to Scan iPhones for Child Abuse Images in iOS 15 | 6+ Things to Know
Starting later this year, Apple will begin proactively scanning photos on users’ iPhones, iPads, and Macs to detect and flag collections of Child Sexual Abuse Material (CSAM) in their photo libraries, alongside other new features coming in iOS 15 to help protect children online.
Apple officially announced the initiative under the heading of “Expanded Protections for Children,” and while it’s an extremely noble goal on the company’s part, it’s also become somewhat controversial, raising concerns from privacy advocates.
There’s actually a trio of new child safety features that Apple will be unveiling later this year in iOS 15, iPadOS 15, watchOS 8, and macOS Monterey: a Messages feature that warns children when they send or receive sexually explicit photos, CSAM detection in iCloud Photos, and expanded guidance in Siri and Search to help with potentially unsafe situations.
Of these three, however, it’s the CSAM detection that’s raising alarm bells in some corners, since privacy advocates fear it could be the beginning of a slippery slope.
Detecting CSAM – Controversial?
To be clear, nobody is arguing that Apple and other big tech companies shouldn’t do more to fight the spread of Child Sexual Abuse Material. The debate is about how far they should be allowed to go in doing so, and whether Apple is opening up a Pandora’s box that authoritarian regimes could use for other purposes in the future.
For instance, security researchers such as Johns Hopkins cryptography expert Matthew Green have raised the possibility of false positives causing innocent users to be wrongfully accused, along with the concern that Apple could eventually be compelled by law enforcement to scan for more than just CSAM.
Meanwhile, privacy advocate Edward Snowden has weighed in with an even more alarmist take on the long-term privacy implications, with suggestions of “secret blacklists” of photos that would be used to turn everyone’s device into an “iNarc.”
Some of these concerns are probably at least part of the reason Apple is only rolling out this feature in the U.S. for now, since it’s working specifically with the U.S. National Center for Missing and Exploited Children (NCMEC). It would need to establish similar partnerships with organizations in other countries, and we’re certainly hoping it will be choosy about partnering with governments that have dubious agendas and poor track records on human rights.
However, it’s also ultimately worth keeping in mind that this is Apple we’re talking about — a company that’s already shown a willingness to pour tons of resources into building systems that are as private and secure as possible.
In fact, at a basic level, Apple isn’t doing anything new here at all — it’s simply doing it with far more privacy and security than anybody ever has before. For instance, as journalist Charles Arthur points out, Google has been doing this in the cloud since 2008, and Facebook started doing the same in 2011.
Apple has reportedly been doing this on the back end for a while too, as the company’s Chief Privacy Officer, Jane Horvath, told a panel at CES in early 2020.
In other words, what’s new here isn’t the fact that Apple is scanning users’ iCloud Photo libraries for CSAM, but rather that it’s going to move this scanning directly onto users’ devices in iOS 15. This is actually a good thing. Here’s why.
Update: It appears that the comments Jane Horvath made during the Chief Privacy Officer Roundtable at CES 2020 were misconstrued. Horvath was asked whether content uploaded to iCloud should be screened for CSAM, and she responded rather obliquely that Apple was “utilizing some technologies to help screen for child sexual abuse material.” Apple recently clarified to Ben Lovejoy at 9to5Mac that this was in reference to scanning iCloud Mail attachments, which have never been encrypted in the first place, even “at rest” on Apple’s servers. That said, since iCloud Photos don’t use end-to-end encryption at this point, it remains technically possible for Apple to scan them server-side; the company simply hasn’t chosen to do so.
How It Works
First, it’s important to understand how Apple plans to put all of this together, as it’s designed to be far more secure and private than anything it has ever done with your iCloud photos before.
In the Technical Summary of the CSAM Detection feature, Apple explains that its CSAM detection system won’t be looking at photos on the back-end at all anymore. Instead, it’s going to scan photos on your devices before they’re uploaded to iCloud Photo Library.
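To make that on-device step concrete, here’s a deliberately simplified sketch in Swift. It is not Apple’s implementation: the real system uses a perceptual “NeuralHash” and a blinded, encrypted matching protocol rather than a plain cryptographic hash and a readable set, and the function names below are invented purely for illustration.

```swift
import CryptoKit
import Foundation

// Hypothetical loader for an on-device database of known-image digests.
// In the real system, the database ships inside the OS as blinded hashes
// that neither the user nor the device can read in this form.
func loadKnownImageDigests() -> Set<String> {
    return []  // placeholder for this sketch
}

let knownDigests = loadKnownImageDigests()

// Decide whether a photo should carry a safety voucher before upload.
// SHA-256 stands in for Apple's perceptual NeuralHash here; a real
// perceptual hash also matches visually similar, not just identical, images.
func shouldAttachSafetyVoucher(to photoData: Data) -> Bool {
    // Hash the photo locally; nothing about the photo is inspected server-side here.
    let digest = SHA256.hash(data: photoData)
    let hex = digest.map { String(format: "%02x", $0) }.joined()

    // A match on its own reports nothing to anyone; it only means the upload
    // gets tagged with an encrypted voucher that stays unreadable below the
    // account-level threshold described further down.
    return knownDigests.contains(hex)
}
```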
If anything, this sounds very much like Apple is getting ready to introduce end-to-end encryption for users’ iCloud Photo Libraries, which would end up being a much bigger win for privacy.
As things stand now, there’s no reason for Apple to push out this level of scanning onto the iPhone, iPad, or Mac. Apple has confirmed that CSAM Detection is only enabled when iCloud Photo Library is turned on, so what’s the point of building code to scan photos on devices before they’re uploaded when Apple can just sift through them at its leisure directly in the cloud?
The answer pretty much has to be encryption. If Apple enables end-to-end encryption for iCloud Photo Library, it will lose the ability to scan for CSAM directly in the cloud. Apple has already been walking a tightrope for years between user privacy and the demands of law enforcement, and it’s safe to say the U.S. Justice Department would not look kindly on Apple suddenly turning exabytes of users’ photos into a black box that it couldn’t peer into.
As scary as the concept of scanning your photos on your iPhone may be, consider that right now everything stored in your iCloud Photo Library is already completely wide open to inspection by Apple or law enforcement. It’s only Apple’s internal policies that prevent the FBI from going on a fishing expedition through your photo library whenever it feels like it.
Scanning for CSAM directly on your device will allow Apple to eventually lock down your photos in the cloud without creating a safe haven for child abusers to store their content online with impunity.
To be clear, the entire system is also built with encryption and multiple checks and balances in place. For example:
- Photos are only matched against known CSAM images from the NCMEC database. Apple is not using machine learning to “figure out” whether an image contains CSAM, and a cute photo or video of your toddler running around is not going to get flagged by this algorithm.
- False positives are theoretically possible, but rare. Because photos are compared by reducing each one to a numerical code, or “hash,” it’s conceivable that two entirely unrelated photos could produce a false match. In cryptographic terms, these are called “collisions”: a completely innocuous photo coincidentally matching the hash of a known image in the CSAM database.
- A minimum threshold of matches is required before Apple can look at flagged photos. Because of the possibility of collisions, Apple doesn’t even get notified when the matched images remain below a certain threshold. Instead, using a technique called “threshold secret sharing,” each matched photo is tagged with an encrypted “safety voucher” that’s designed to be cryptographically unreadable until the threshold is reached (a toy version of this idea is sketched after this list). This means Apple couldn’t find these flagged photos even if it tried. The vouchers are securely stored, so the photos can be identified when and if the user’s account does hit critical mass.
- Apple says there’s less than a 1 in 1 trillion chance per year of an account being incorrectly flagged. This is done by setting the threshold high enough that only accounts with a significant number of CSAM images would even come to Apple’s attention in the first place. Accounts below the threshold remain completely invisible to the system, and the back-of-the-envelope calculation after this list shows how a high threshold keeps the error rate that low.
- Flagged accounts get manually reviewed by a human. Once an account crosses the threshold, Apple says it will follow a manual review process, confirming that the content is actually CSAM. If that’s determined to be the case, then the account will be disabled and the user will be reported to the NCMEC. There will also be an appeal process for users to have their accounts reinstated if they feel that they’ve been mistakenly flagged.
- Only photos flagged as CSAM are disclosed. If and when iCloud Photo Library becomes end-to-end encrypted, Apple won’t be able to view any of your photos at all on a routine basis. And even if an account exceeds the CSAM matching threshold and gets flagged for further investigation, only those photos that were flagged will be viewable by Apple staff or the NCMEC. The rest of the user’s iCloud Photo Library remains safely encrypted, so this is not an invitation for law enforcement to take a joyride through your entire photo library.
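For readers curious about the “threshold secret sharing” mentioned above, here is a toy Shamir-style sketch in Swift. It is not Apple’s construction and the numbers are deliberately tiny: the real system protects actual cryptographic key material with production-grade parameters, while this version works over a small prime field purely to show the mechanics.

```swift
// Toy Shamir-style threshold secret sharing over a small prime field.
// Illustration only; the prime, the secret, and the share counts are made up.
let p = 65521  // a small prime modulus, far too small for real use

// Modular exponentiation by repeated squaring.
func powMod(_ base: Int, _ exp: Int, _ mod: Int) -> Int {
    var result = 1, b = base % mod, e = exp
    while e > 0 {
        if e & 1 == 1 { result = result * b % mod }
        b = b * b % mod
        e >>= 1
    }
    return result
}

// Modular inverse via Fermat's little theorem (valid because p is prime).
func invMod(_ a: Int, _ mod: Int) -> Int { powMod(a, mod - 2, mod) }

// Split `secret` into `count` shares; any `threshold` of them reconstruct it.
func makeShares(secret: Int, threshold: Int, count: Int) -> [(x: Int, y: Int)] {
    // Random polynomial of degree threshold - 1 whose constant term is the secret.
    let coeffs = [secret] + (1..<threshold).map { _ in Int.random(in: 0..<p) }
    var shares: [(x: Int, y: Int)] = []
    for x in 1...count {
        var y = 0
        for (i, c) in coeffs.enumerated() { y = (y + c * powMod(x, i, p)) % p }
        shares.append((x: x, y: y))
    }
    return shares
}

// Lagrange interpolation at x = 0 recovers the constant term (the secret).
func reconstruct(from shares: [(x: Int, y: Int)]) -> Int {
    var secret = 0
    for (j, sj) in shares.enumerated() {
        var num = 1, den = 1
        for (m, sm) in shares.enumerated() where m != j {
            num = num * (p - sm.x) % p               // (0 - x_m) mod p
            den = den * ((sj.x - sm.x + p) % p) % p  // (x_j - x_m) mod p
        }
        secret = (secret + sj.y * num % p * invMod(den, p)) % p
    }
    return secret
}

let shares = makeShares(secret: 12345, threshold: 3, count: 10)
print(reconstruct(from: Array(shares.prefix(3))))  // prints 12345: threshold met
print(reconstruct(from: Array(shares.prefix(2))))  // unrelated value: below threshold
```

The takeaway is simply that meeting the threshold reconstructs the secret exactly, while anything short of it reveals essentially nothing, which is what lets Apple stay blind to the contents of safety vouchers until an account actually crosses the line.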
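And to see why a sufficiently high threshold makes an account-level false positive so unlikely, here’s a back-of-the-envelope calculation. Apple hasn’t published its per-image false-match rate or the exact threshold, so the figures below are placeholders chosen only to show the shape of the math.

```swift
import Foundation

// Placeholder figures; Apple has not published its actual parameters.
let perImageFalseMatchRate = 1e-6   // hypothetical chance a benign photo collides
let librarySize = 100_000           // hypothetical number of photos in an account
let matchThreshold = 30             // hypothetical number of matches required

// log of the binomial coefficient C(n, k), computed term by term for stability.
func logChoose(_ n: Int, _ k: Int) -> Double {
    var total = 0.0
    for i in 1...k { total += log(Double(n - k + i)) - log(Double(i)) }
    return total
}

// P(at least `threshold` false matches) = upper tail of a Binomial(n, p).
func falseFlagProbability(n: Int, p: Double, threshold: Int) -> Double {
    var total = 0.0
    for k in threshold...n {
        let term = exp(logChoose(n, k) + Double(k) * log(p) + Double(n - k) * log(1 - p))
        total += term
        if term < total * 1e-18 { break }  // remaining terms are negligible
    }
    return total
}

let probability = falseFlagProbability(n: librarySize,
                                       p: perImageFalseMatchRate,
                                       threshold: matchThreshold)
print(probability)  // astronomically small with these placeholder numbers
```

Even with fairly generous placeholder numbers, the chance of an entirely innocent library racking up enough coincidental matches to cross the threshold collapses to something far smaller than one in a trillion; it’s the threshold, not any individual hash comparison, that carries the guarantee.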
One downside to this whole system, however, is that users won’t be given any insight into what’s going on. Apple notes that users can’t access or view the database of known CSAM images, for obvious reasons, but they also won’t be told if the system flags one of their images as CSAM.
For those who are interested in the nitty-gritty details, Apple’s CSAM Detection Technical Summary goes into much greater detail, and is definitely worth a read. As usual, the effort Apple has put into building this in a way that’s both private and secure is seriously impressive.
While that may not be enough to quell the concerns of security and privacy advocates who fear the technology could be misused, it may turn out to be a necessary evil: a tradeoff between providing much stronger encryption for the trillions of harmless photos already stored in people’s iCloud Photo Libraries and ensuring that those who exploit children by creating and sharing abusive images can still be held responsible for their actions.
Apple says that CSAM Detection “will be included in an upcoming release of iOS 15 and iPadOS 15,” meaning it won’t necessarily be there when iOS 15.0 launches next month. It will also only apply to users who have iCloud Photo Library enabled on their devices, since presumably Apple has a legal responsibility to ensure that it’s not storing CSAM content on its own servers, but realistically understands that it’s none of its business what users keep in their own personal storage.