UK Backs Down on Controversial iMessage Surveillance Bill (Sort Of)
The UK government has put an indefinite hold on plans that would have forced Apple and other messaging service providers to create “back door” security vulnerabilities to allow law enforcement and intelligence agencies to monitor users’ activity.
In June, the UK Home Office opened a public consultation into proposed revisions to its Investigatory Powers Act (IPA) that it claimed were being put forward to “protect the public from criminals, child sex abusers and terrorists.” Among the changes, tech companies would be required to “install technology to scan for child-abuse material in encrypted messaging apps and other services.”
Companies would also need to provide advance notification to the Home Office of any changes to product security features before being cleared to release them to the general public. In other words, even the most minor iOS point releases would need to be screened and approved by the UK government before Apple would be allowed to make them available for download by customers in the UK.
Unsurprisingly, Apple vehemently opposed these proposals, stating that it would shut down FaceTime and iMessage in the UK if the new surveillance bill became law. It wasn’t the only one, either; both Signal and WhatsApp threatened to pull out of the UK entirely if the act, which critics call a “snooper’s charter,” passed.
In a nine-page submission seen by BBC News, Apple unequivocally states that the government’s proposal “constitutes a serious and direct threat to data security and information privacy.” Apple also takes umbrage at the notion that UK government policy should dictate the security of iPhone users globally, since it would be impossible to comply with the proposal without weakening security for every iPhone user worldwide by creating a back door into the end-to-end encryption used by Apple’s services.
Thankfully, these arguments haven’t fallen entirely on deaf ears. While the Online Safety Bill still gives the UK government the power to order the scanning of messaging apps, the government has conceded that the technology doesn’t yet exist to do this properly and safely.
According to The Financial Times, junior arts and heritage minister Lord Stephen Parkinson made a statement to the House of Lords today to end the stand-off with tech companies by confirming that the UK tech regulator, Ofcom, will only require companies to scan their networks “when a technology was developed that was capable of doing so.”
A notice can only be issued where technically feasible and where technology has been accredited as meeting minimum standards of accuracy in detecting only child sexual abuse and exploitation content.
Lord Stephen Parkinson, UK junior arts and heritage minister
The Financial Times adds that security experts believe such technology is years away, which isn’t surprising, given that even the most sophisticated attempts to build these kinds of safety features have turned out to have hidden flaws.
For example, in 2021, Apple announced a controversial plan to begin scanning iCloud Photos for CSAM — Child Sexual Abuse Material. Apple’s solution was extremely privacy-focused: photos were matched on-device against hashes of known CSAM using its NeuralHash perceptual hashing algorithm, and cryptographic threshold techniques prevented Apple from reviewing an account until roughly 30 matches had accumulated. Even so, privacy advocates opposed the plan, making the “slippery slope” argument that the same technology that could scan for CSAM could easily be abused in the future by authoritarian regimes to scan for “objectionable” images stored by protestors.
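For the curious, here’s a minimal Python sketch of what threshold-based hash matching looks like in principle. Everything in it is an illustrative assumption rather than Apple’s actual code: Apple’s system used the perceptual NeuralHash algorithm and private set intersection so the device never learned which images matched, while this toy version uses an ordinary cryptographic hash and plain set lookups.

```python
# A minimal, hypothetical sketch of threshold-based hash matching, loosely
# modeled on the approach Apple described in 2021. NOT Apple's actual
# implementation: real systems use a perceptual hash (NeuralHash) that is
# robust to resizing and recompression, plus private set intersection.
import hashlib

# Hypothetical blocklist of known-image hashes (hex digests), supplied by
# child-safety organizations in the real system.
KNOWN_HASHES: set[str] = {
    "9f2b4a1c",  # placeholder entry, not a real hash
}

# Apple said an account would only be flagged after roughly 30 matches.
MATCH_THRESHOLD = 30

def image_hash(image_bytes: bytes) -> str:
    """Placeholder hash. SHA-256 only matches byte-identical files; a real
    scanner would use a perceptual hash to survive minor image edits."""
    return hashlib.sha256(image_bytes).hexdigest()

def count_matches(images: list[bytes], known_hashes: set[str]) -> int:
    """Count how many of a user's images match the blocklist."""
    return sum(1 for img in images if image_hash(img) in known_hashes)

def should_flag_account(images: list[bytes], known_hashes: set[str]) -> bool:
    """The match threshold is what keeps isolated false positives from
    ever surfacing an account for human review."""
    return count_matches(images, known_hashes) >= MATCH_THRESHOLD
```

Even in this simplified form, the design trade-off is visible: the blocklist must ship to (or be queried from) every device, and whoever controls its contents controls what gets flagged, which is precisely the concern privacy advocates raised.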
As a result, Apple quietly abandoned the initiative, and we heard nothing more about it until last week, when Apple opened up about the reasoning behind its decision, which came down to the realization that even its well-thought-out system was ultimately a Pandora’s box that would create “new threat vectors” and “unintended consequences.”
Sadly, the definition of what’s technically feasible ultimately rests with the UK government, which insists that its position on the issue hasn’t changed. It’s also important to remember that the new bill still gives the government the power to order this kind of surveillance — it’s merely saying it won’t use those powers for the time being. However, it clearly expects companies to develop these technologies eventually — and it reserves the right to force them to do so.
As has always been the case, as a last resort, on a case-by-case basis and only when stringent privacy safeguards have been met, [the legislation] will enable Ofcom to direct companies to either use, or make best efforts to develop or source, technology to identify and remove illegal child sexual abuse content — which we know can be developed.
Statement from the UK Government
Child safety advocates have also been increasing pressure on tech companies and government agencies to develop the technology to scan and detect CSAM. The Financial Times cites Richard Collard, head of child safety online policy at the UK’s National Society for the Prevention of Cruelty to Children, who states that the UK public “overwhelmingly support measures to tackle child abuse in end-to-end encrypted environments” and that tech companies need to “show industry leadership by listening to the public and investing in technology that protects both the safety and privacy rights of all users.”
Meanwhile, Heat Initiative, the child safety group that prompted Apple’s recent explanation of why it killed its CSAM plans, has launched a very pointed campaign against Apple in an attempt to get it to resurrect its CSAM detection system, accusing Apple of deliberately allowing child sexual abuse to be stored on iCloud, and demanding that Apple “deliver on their commitment” to detect child sexual abuse images and videos.
In other words, while Apple and other messaging providers may have won this particular battle in the UK, the war is far from over, as they find themselves repeatedly caught between privacy advocates who consider any monitoring to be unacceptable and child safety advocates who believe that they’re not doing nearly enough to stem the flow of CSAM.