West Virginia Sues Apple, Alleging iCloud is a ‘Safe Haven’ for CSAM
West Virginia’s attorney general is suing Apple, accusing the Cupertino company of allowing its iCloud online storage service to become what one of its own executives once called “the greatest platform for distributing child porn.”
As reported by Reuters, Attorney General JB McCuskey (R) alleges that Apple prioritizes user privacy over the safety of children. The case is believed to be the first of its kind brought by a government agency over the storage and distribution of child sexual abuse material (CSAM) on Apple’s cloud storage platform.
The lawsuit, filed in Mason County Circuit Court, seeks statutory and punitive damages and asks a judge to force Apple to adopt safer product designs, including measures to detect child abuse material.
“These images are a permanent record of a child’s trauma, and that child is revictimized every time the material is shared or viewed,” McCuskey said in a statement.
Apple released a statement saying it has put measures in place to prevent children from sending or receiving nude images.
All of our industry-leading parental controls and features, like Communication Safety — which automatically intervenes on kids’ devices when nudity is detected in Messages, shared Photos, AirDrop and even live FaceTime calls — are designed with the safety, security, and privacy of our users at their core.
On Thursday, Apple announced plans to roll out a feature in the near future allowing US users to flag inappropriate content directly to the iPhone maker via a “Report to Apple” button. The company says the new feature is not a response to the lawsuit and had been planned before the filing.
Users in the United States and around the world are increasingly concerned about the proliferation of harmful content on smartphones and social media, and advocacy groups are pressuring companies like Apple, Google, and Meta to address the problem.
The Encryption Deadlock: Privacy vs. Policing
The West Virginia lawsuit focuses on Apple’s end-to-end encryption, which puts the digital files stored in an iCloud user’s account beyond the reach of law enforcement; even Apple does not hold the keys needed to decrypt a user’s files. The state argues this technology has allowed CSAM to flourish on the iCloud platform.
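To see why that lockout is a matter of design rather than policy, consider a minimal sketch of client-side encryption. This is illustrative only, assuming the third-party Python cryptography package; it is not Apple’s actual implementation, which uses a hierarchy of hardware-protected keys. The point is simply that the key is generated and kept on the user’s device, so the server only ever stores ciphertext it cannot read.

```python
# Minimal sketch of client-side (end-to-end) encryption; assumes the
# third-party "cryptography" package. Not Apple's actual key design.
from cryptography.fernet import Fernet

device_key = Fernet.generate_key()  # generated and kept on the trusted device
cipher = Fernet(device_key)

# The cloud provider stores only this ciphertext; without device_key,
# it cannot decrypt the file, and neither can law enforcement.
ciphertext = cipher.encrypt(b"contents of a user's photo")

# Decryption is only possible on a device that holds the key.
plaintext = cipher.decrypt(ciphertext)
assert plaintext == b"contents of a user's photo"
```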
This is just the latest round in a long-running battle between governments and privacy advocates over end-to-end encryption. Privacy advocates say the technology is a valuable safeguard against government eavesdropping; law enforcement counters that the world’s bad actors use it to hinder criminal investigations.
In 2021, Apple announced an ingenious plan for a system that would scan photos for CSAM before they were uploaded to the cloud. However, it first delayed and ultimately dropped those plans after opposition from privacy advocates, who feared governments could abuse the capability to censor content or arrest users over images unrelated to CSAM. While that decision placated those worried about authoritarian overreach, Apple was subsequently hit from the other side by child safety advocates insisting the company wasn’t doing enough, with a $1.2 billion lawsuit to help make their point.
‘Chosen to Not Know’: The Internal Fallout
In its court filing, the West Virginia Attorney General’s office cited a 2020 iMessage conversation between Eric Friedman, who was then serving as Apple’s anti-fraud chief, and security chief Herve Sibert, in which Friedman called Apple “the greatest platform for distributing child porn.”
The spotlight at Facebook etc. is all on trust and safety (fake accounts, etc). In privacy, they suck. Our priorities are the inverse. Which is why we are the greatest platform for distributing child porn, etc.
Eric Friedman, Apple anti-fraud chief, in a 2020 iMessage thread
This internal conversation first came to light during discovery in the landmark Epic Games v. Apple case and was shared by The Verge in August 2021. Taken in context, Friedman is referring to how Apple’s focus on privacy has prevented it from taking a more proactive approach against CSAM. In a follow-up comment, Friedman adds that Apple has “chosen to not know in enough places where we really cannot say” how much CSAM is actually passing through its systems, an admission that plays right into West Virginia’s case.
The Reporting Gap: Why Regulators Are Fuming
While Google and other tech companies offering cloud storage check uploaded images and attachments against databases of known CSAM hashes provided by the National Center for Missing and Exploited Children (NCMEC) and other clearinghouses, Apple took a different approach. As the lawsuit points out, Apple reportedly made only 267 CSAM reports to NCMEC in 2023, a number that pales in comparison to Google’s 1.47 million reports and Meta’s more than 30 million.
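In broad strokes, that industry-standard check is simple hash matching. The sketch below is a simplified illustration with a hypothetical, empty hash set; real services match perceptual hashes such as Microsoft’s PhotoDNA, which survive resizing and re-encoding, rather than the exact SHA-256 digests used here.

```python
import hashlib

# Hypothetical stand-in for a clearinghouse hash list; in practice the
# entries would be perceptual hashes distributed by NCMEC and others.
KNOWN_HASHES: set[str] = set()

def matches_known_material(file_bytes: bytes) -> bool:
    """Return True if an uploaded file's digest appears in the hash list.

    An exact digest only catches byte-identical copies; production
    scanners use perceptual hashing so altered copies still match.
    """
    return hashlib.sha256(file_bytes).hexdigest() in KNOWN_HASHES
```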
That’s likely because Apple limited its CSAM scanning to iCloud Mail; it didn’t scan any of the photos or files users uploaded to their iCloud storage. At the time, it also didn’t offer end-to-end encryption for that data, which allowed law enforcement to access saved files with a warrant.
While Apple planned end-to-end encryption for iCloud backups, which would have rendered much of this data inaccessible to law enforcement, it abandoned those plans after pressure from the FBI. Meanwhile, it faced threats from lawmakers insisting on a back door to bypass the iPhone’s on-device encryption, which has often frustrated law enforcement officials.
With its 2021 plans to implement a sophisticated “NeuralHash” CSAM detection system, it seemed like Apple was preparing to turn the key on end-to-end encryption in iCloud. NeuralHash was designed to scan images on users’ devices before they were uploaded, balancing the detection of potential CSAM with user privacy. Crucially, on-device scanning would also eliminate any need to check for this material in the cloud, leaving Apple free to securely encrypt everything without drawing the ire of the FBI and lawmakers.
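Conceptually, the on-device version of that check works much like the server-side one, with an extra safeguard: Apple’s published design reportedly required roughly 30 matches before an account could be flagged, so a stray false positive couldn’t trigger a report on its own. Here’s a rough sketch under those assumptions; the real NeuralHash system used a learned perceptual hash and cryptographic “safety vouchers” that this toy version doesn’t attempt to reproduce.

```python
import hashlib
from dataclasses import dataclass

MATCH_THRESHOLD = 30            # Apple's design reportedly used ~30 matches
KNOWN_HASHES: set[str] = set()  # placeholder for the on-device hash database

@dataclass
class OnDeviceScanner:
    """Toy pre-upload scanner that counts hash matches across uploads."""
    match_count: int = 0

    def flag_before_upload(self, photo_bytes: bytes) -> bool:
        """Hash the photo locally; return True only past the threshold."""
        if hashlib.sha256(photo_bytes).hexdigest() in KNOWN_HASHES:
            self.match_count += 1
        return self.match_count >= MATCH_THRESHOLD
```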
Despite backing down on that plan in the face of criticism from privacy advocates like the Electronic Frontier Foundation, Apple went ahead and launched end-to-end encryption for nearly all iCloud data anyway, in the form of the optional Advanced Data Protection feature.
When a user enables Advanced Data Protection, their trusted devices retain sole access to the encryption keys for the majority of their iCloud data, protecting it with end-to-end encryption. That raises user privacy to a whole new level, but it hasn’t won Apple any friends among regulators like West Virginia’s AG, who argues that Apple’s long-standing refusal to scan iCloud data for CSAM has allowed illegal content to flourish and abusers to evade law enforcement, harming the state’s public health and child protection systems.