It Turns Out Apple Wasn’t Scanning iCloud Photos for Child Abuse Material | What This Means and What’s Next


Controversy continues to rage over Apple’s recent announcement of its plans to implement a new Child Sexual Abuse Material (CSAM) Detection system in iOS 15. The move has effectively put Apple on the defensive, trying to explain how the new system will actually work and why it’s ultimately a win for privacy.

While many of Apple’s explanations have fallen on deaf ears, it turns out that this isn’t simply a new way for the company to scan for CSAM in iCloud Photos.

Contrary to what we and many others believed, this will actually be the first time that Apple has ever scanned iCloud Photo libraries for CSAM.

Since most tech companies that offer cloud-based photo storage and sharing have been scanning for CSAM for years — Google has been doing it since 2008, for example — many believed that Apple was probably doing something similar. After all, iCloud Photos are not end-to-end encrypted on Apple’s servers, so the company clearly can scan photos for CSAM and other content.

This seemed to have been confirmed when the company’s Chief Privacy Officer, Jane Horvath, spoke at the Chief Privacy Officer Roundtable at CES 2020, admitting that Apple was “utilizing some technologies to help screen for child sexual abuse material.” However, while Horvath was responding to a question from the moderator about whether content should be “screened when it’s uploaded to either iCloud or Dropbox or any other cloud services,” she never mentioned iCloud specifically.

Further, around the same time that Horvath made these comments, Apple published a page titled Our Commitment to Child Safety, which also noted that the company used “image matching technology to help find and report child exploitation.”

Apple is dedicated to protecting children throughout our ecosystem wherever our products are used, and we continue to support innovation in this space. We have developed robust protections at all levels of our software platform and throughout our supply chain. As part of this commitment, Apple uses image matching technology to help find and report child exploitation. Much like spam filters in email, our systems use electronic signatures to find suspected child exploitation. We validate each match with individual review. Accounts with child exploitation content violate our terms and conditions of service, and any accounts we find with this material will be disabled.

At the time, nobody really thought twice about this, since it was something that everybody else was doing. However, it has naturally come up again in light of the current controversy over moving CSAM Detection onto customers’ devices, along with the question of why Apple would need to do this at all.

9to5Mac’s Ben Lovejoy decided to dig a bit deeper by reaching out to Apple and finding out exactly what Horvath meant, and it turns out Apple was talking only about iCloud Mail.

Apple has confirmed to me that it already scans iCloud Mail for CSAM, and has been doing so since 2019. It has not, however, been scanning iCloud Photos or iCloud backups.

Ben Lovejoy, 9to5Mac

Again, just to be clear, Apple has always had the capability to scan iCloud Photo libraries on its servers for CSAM, or anything else really, but for whatever reason, it has chosen not to do so.

The most likely explanation is some combination of technical and privacy limitations. Either way, Apple would have had to build a CSAM detection system from scratch, so it evidently decided it was better to direct those efforts toward building the feature into iOS and macOS.

As counterintuitive as this may sound, that’s actually still a win for privacy. Remember that, contrary to some of the alarmist fears that are going around, Apple’s new CSAM detection system is not scanning everything on your iPhone — it’s limited only to those photos that are in the process of being uploaded to iCloud, and it doesn’t even run if iCloud Photo Library is disabled.

Ultimately, this is a way of scanning what will end up on Apple’s servers, and flagging CSAM before it even arrives.
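To put that gating into concrete terms, here’s a minimal Swift-style sketch of the decision flow as Apple has described it. Everything below is illustrative: the type and function names are our own placeholders rather than Apple APIs, and the real matching and voucher cryptography is far more involved.

```swift
import Foundation

// Illustrative sketch only: these names are hypothetical placeholders,
// not Apple's actual APIs or implementation.

struct Photo {
    let id: String
    let isQueuedForICloudUpload: Bool
}

struct SafetyVoucher {
    let photoID: String
    let encryptedMatchData: Data
}

func processForUpload(_ photo: Photo, iCloudPhotosEnabled: Bool) -> SafetyVoucher? {
    // If iCloud Photo Library is turned off, no CSAM matching happens at all.
    guard iCloudPhotosEnabled else { return nil }

    // Only photos actually queued for upload to iCloud Photos are matched;
    // everything else on the device is never touched.
    guard photo.isQueuedForICloudUpload else { return nil }

    // The on-device match result is wrapped in an encrypted "safety voucher"
    // that travels with the upload; Apple says vouchers only become readable
    // once an account crosses a threshold of matches.
    return SafetyVoucher(photoID: photo.id, encryptedMatchData: Data())
}
```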

How Can This Be Good for Privacy?

Firstly, it’s our hope that this is the first step toward introducing full end-to-end encryption (E2EE) in iCloud Photos. If Apple were to do this, it would be a massive win for privacy, since iCloud would instantly become the most secure and private mainstream cloud photo storage service on the planet.

Consider that, as of now, all the photos that you upload to iCloud Photo Library are technically viewable by Apple. It’s only Apple’s commitment to privacy and its internal policies that preclude Apple staffers from taking a joyride through your photo library.

However, should Apple ever be served with a warrant by a law enforcement agency, it would be compelled to hand over your entire iCloud Photo Library, allowing investigators to fish through your photo collection and decide what they want to take issue with.

On the other hand, much like we saw in the case of the San Bernardino shooter, no warrant can compel Apple to hand over what it doesn’t have access to.

Even if Apple doesn’t eventually implement E2EE, however, it’s still better for our privacy that the scanning occurs on your iPhone. Security researchers can audit what’s happening on iOS, but they don’t have access to what Apple is running on its servers.

However, it’s been Apple’s stance for years that it doesn’t scan or analyze iCloud Photos on the back-end at all — so there’s nothing that needs to be audited.

When Apple added new face and object recognition capabilities to Photos back in 2016 with the release of iOS 10, it made a big deal of the privacy benefits of doing this kind of scanning on the A-series chips in your iPhone rather than on the back end, and nobody disputed at the time that this was a great thing for privacy.

In fact, this is another factor that makes the current controversy so overblown. While it’s certainly possible that the new CSAM Detection could be abused by changing the source database — something that Apple vociferously claims it will not allow to happen — the same could be said of the face and object recognition that’s been in all of our iPhones for the past five years.

For those who are afraid that oppressive regimes could abuse these systems, which is more dangerous: a system that can only flag matches against a database of known photos, or an on-device algorithm that could be silently and secretly reporting photos based on the faces and objects found within them?
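To make that contrast concrete, here’s a minimal Swift sketch of the two models. It’s purely illustrative, with hypothetical function names, and assumes nothing about how Apple’s NeuralHash matching or the Photos app’s object recognition is actually implemented.

```swift
// Illustrative sketch of the two approaches contrasted above; the hashing and
// labeling details are hypothetical stand-ins, not Apple's real pipeline.

// Approach 1: matching against a fixed database of known images.
// A photo can only be flagged if its perceptual hash is already in the
// supplied database, so novel photos can never produce a match.
func matchesKnownDatabase(photoHash: String, knownHashes: Set<String>) -> Bool {
    return knownHashes.contains(photoHash)
}

// Approach 2: an open-ended classifier that judges a photo's content.
// A system like this could, in principle, be repointed at any face or
// object it recognizes, which is a far broader surveillance surface.
func classifierFlags(photoLabels: [String], forbiddenLabels: Set<String>) -> Bool {
    return photoLabels.contains(where: forbiddenLabels.contains)
}
```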

Your iPhone Photos app is already indexing everything in your photos, but according to Apple, that data never leaves your iPhone. Of course, we have to take Apple’s word for that, but that’s no different from taking Apple’s word that the CSAM Detection system won’t be abused.

There’s a good reason that the system is launching in the U.S. only, where Apple is working with a trusted organization that deals exclusively with child exploitation: not the FBI or the Department of Justice, but the National Center for Missing & Exploited Children (NCMEC).

Why Does Apple Need to Do This at All?

Many rightly feel that Apple has no business scanning anything on their iPhones at all, and some would even suggest that this extends to iCloud.

However, we believe that Apple does have a moral and ethical responsibility to prevent CSAM from touching its servers.

Even with the new algorithm in place, Apple rightly still very much feels that “what happens on your iPhone stays on your iPhone,” but as we pointed out last week, that changes when your data leaves your iPhone to go to iCloud.

What Does This Really Mean?

To be clear, Apple is not going to be scanning and reporting on content from your iPhone that isn’t being uploaded to iCloud Photos. Apple really does still consider your iPhone storage to be your own private domain.

Some have used the analogy that this is like Apple coming into your home and rummaging through your personal property, but the more accurate comparison would be a storage service picking up packages from your home and inspecting them before moving them into a storage locker on its own property.

While physical storage services don’t care about things like CSAM, there are other dangerous goods that customers are prohibited from storing, and it’s within their rights to inspect what goes in to make sure it’s complying with the rules. You can store what you want in your own home, but when you choose to store it somewhere else, the person or company who owns that space gets some say in the matter.

In this case, Apple ultimately doesn’t want CSAM on its servers — and we can’t say we blame them. In fact, in a recent internal Apple iMessage thread shared by 9to5Mac, the company’s anti-fraud chief, Eric Friedman, said that Apple’s privacy policies had made it “the greatest platform for distributing child porn.”

The spotlight at Facebook etc. is all on trust and safety (fake accounts, etc). In privacy, they suck. Our priorities are the inverse. Which is why we are the greatest platform for distributing child porn, etc.

Eric Friedman, Apple

Friedman’s comments illustrate the very fine line that Apple has to walk in balancing user privacy and public safety. To make matters worse, this goes far beyond a moral and ethical issue, since many U.S. lawmakers are far more interested in the needs of law enforcement than in the “fundamental human right” of privacy.

For instance, two years ago, Sen. Lindsey Graham (R-S.C.), then chair of the Senate Judiciary Committee, threatened that Congress would “impose its will” on Apple if it didn’t find a way to offer a backdoor for law enforcement to access encrypted iPhones. Sen. Graham has repeatedly called the iPhone a “safe haven for criminals where they can plan their misdeeds” and cited “encrypted apps that child molesters use” as an easy bogeyman to help rally public support for law enforcement’s position.

Rumour has it that Apple once considered fully end-to-end encrypting iCloud Backups but abandoned the idea after pressure from the FBI, and it’s not hard to see why. While the iPhone itself is securely encrypted — to the point of frustrating law enforcement — iCloud Backups provide an easy back door for the FBI and other law enforcement agencies to get what they need in criminal investigations. Closing that door would undoubtedly have the Senate Judiciary Committee and other lawmakers redoubling their efforts to outlaw iPhone encryption.

While public opinion is generally divided when it comes to the balance between privacy and giving law enforcement the tools to catch criminals, child abuse is one area in which people are considerably more united.

If Apple hopes to improve security for everyone else without raising the ire of lawmakers and getting its privacy initiatives legislated out of existence, there are areas like preventing CSAM where it’s going to have to make some necessary and important compromises.
