Despite Apple’s strong stance on privacy, we’ve always known the company has to walk a pretty fine line when it comes to locking things down too tightly, especially where law enforcement agencies are concerned.
Apple has always emphasized that everything stored on your iPhone is securely encrypted, to the point that it’s basically impossible to get at any of it without knowing your passcode.
Except for targeted malware attacks, the only way to get into an iPhone is to “brute-force” the passcode, which basically means hooking it up to a specialized device that tries every possible combination of numbers — and possibly letters — until it finds the correct one. A longer alphanumeric passcode, however, makes this effectively impossible within a normal human lifespan.
In fact, the iPhone is so secure that it’s put Apple at the centre of a pretty big controversy on whether big tech companies should be required to create a “back door” for government and law enforcement agencies.
U.S. lawmakers, in particular, have been taking steps to make end-to-end encryption illegal, raising the spectre of child exploitation as a bogeyman to justify their position that Apple should provide a “master key” for law enforcement to bypass encryption and easily perform a warranted search of any iPhone that comes under investigation. It’s no wonder Apple has been trying to get ahead of the curve and appease lawmakers by finding a middle ground.
After all, if Apple doesn’t tread carefully, it risks having all the privacy and security protections that it’s carefully built into iOS legislated out of existence by lawmakers under the guise of protecting kids.
This is undoubtedly also the thinking behind Apple’s new Communication Safety feature that’s coming in iOS 15.2, particularly since the entire iMessage platform is already tightly end-to-end encrypted, not just on each user’s iPhone, but also as it travels through Apple’s cloud servers.
In other words, barring any industrial-strength spyware on your device, when you send an iMessage to somebody, there’s no way for anybody to intercept or read that message apart from the intended recipient(s).
Unfortunately, as great as that sounds, there are a few other weak links in how the Messages app stores its data that could result in others getting access to your messages, and this is especially true for law enforcement agencies.
Apple has never made any secret that it will comply with any valid law enforcement request to provide whatever data it can, which generally includes everything in your iCloud Backup.
In fact, during a Senate hearing two years ago, Apple’s head of user privacy, Erik Neuenschwander, shared that the company received 127,000 requests from law enforcement from 2012 to 2019, and in most cases, it responded to these within 20 minutes, usually by handing over all the pertinent data stored on its servers.
To be clear, Apple still can’t open an iPhone. When senators accused Apple of blatantly refusing court orders to “open” an iPhone, Neuenschwander pointed out that no matter how much it may want to, Apple can’t do what is essentially impossible, which includes breaking the strong encryption it’s created for the iPhone.
Many lawmakers and politicians refuse to buy into this particular point, however, maintaining that Apple should be required to re-engineer its devices so that this becomes possible.
Fortunately for user privacy, those wishes have yet to become enshrined in law, so for now, agencies such as the FBI will need to be content with whatever Apple can provide.
An internal FBI document recently obtained and shared by Property of the People (via AppleInsider) outlines how iMessage stacks up against other secure messaging systems from the perspective of the FBI’s ability to legally access content and metadata from them. The document is unclassified but labeled as For Official Use Only (FOUO) and Law Enforcement Sensitive (LES).
While the document spells out what we already know, it’s an interesting inside look at where iMessage fits in alongside others such as Signal, Telegram, and WhatsApp.
How Secure Is iMessage?
In the case of iMessage, the key vulnerability is one that you should already be aware of, and it ultimately comes down to any data you’ve stored in your iCloud Backups.
Specifically, the document notes that the FBI can obtain “Limited” message content from iMessage. A subpoena “can render basic subscriber information,” and 25 days of iMessage lookups to and from a target number — although a footnote explains that Apple “includes a disclaimer that a log entry between parties does not indicate a conversation took place,” and that “these query logs have also contained errors.”
On the other hand, a search warrant “can render backups of a target device,” and “if target uses iCloud backup, the encryption keys should also be provided with content return” — that is, as part of the backup — along with iMessages if “target has enabled Messages in iCloud.”
In layman’s terms, this means that if you’re using iCloud Backups, any Messages data from your iPhone is vulnerable to a search warrant — or any hacker who gets access to your iCloud account. This can occur in two different ways:
- If you’re using Messages in iCloud, your messaging data is stored using end-to-end encryption — however, the key used to decrypt those messages is stored in your iCloud Backup.
- If you’re not using Messages in iCloud, your messaging data is stored directly in your iCloud Backup — unencrypted.
In other words, whether it’s the messages themselves or merely the key needed to decrypt them that ends up in your iCloud Backup, the result is the same: if you’re using iCloud Backups, your iMessage history is vulnerable.
Fortunately, you can disable iCloud Backups and back up your iPhone or iPad directly to your computer instead. In this case, your Messages data is safe, since even if you’re using Messages in iCloud, it will be stored using end-to-end encryption, with the key nowhere to be found on Apple’s servers.
Of course, if you’re not using Messages in iCloud, your messaging history won’t be on Apple’s servers at all — it will only be stored locally on your device and in your computer backups.
Note that even in this case, your actual iMessage conversations travel through Apple’s servers, and SMS conversations travel through your carrier’s network. While Apple can’t provide the content of your messages, it may still be able to provide a log of who you’ve been communicating with.
Note that SMS text messages aren’t particularly secure to begin with; they travel unencrypted through your carrier’s network, which means your carrier can see everything sent over those channels.
Just keep in mind that all bets are off if you’re using a company-provided iPhone, as there are numerous management tools that a corporate IT department can install to monitor your activity. In many jurisdictions, however, all communications that occur on company-owned hardware belong to the company, so you shouldn’t have an expectation of privacy in those cases anyway.
Other Messaging Platforms
The FBI document also provided details on what can be obtained from several other popular messaging systems, and many of these came out ahead of Apple’s iMessage.
For example, Signal, Telegram, Threema, Viber, WeChat, and Wickr were all listed as providing “No Message Content.” Line and WhatsApp provided “Limited” content, but only in specific cases.
WhatsApp’s users are vulnerable to the same loophole as iMessage users, with the FBI noting that “If target is using an iPhone and iCloud Backups enabled, iCloud returns may contain WhatsApp data, to include message content.”
Line, on the other hand, can provide seven days’ worth of a specified user’s text chats in response to a valid warrant, but only when the user has not enabled end-to-end encryption.
Among the listed messaging apps, Signal was unsurprisingly the most private of the bunch, with the ability to provide only the date and time that a user registered for the service, and the last time they connected to it.
Telegram came in a close second, with a note that it may disclose IP addresses and phone numbers to relevant authorities “for confirmed terrorist investigations,” but it does so solely at its own discretion.
Lastly, WeChat may be a special case. While the FBI notes that it can’t get any message content out of the China-based chat service, that’s probably not the case for Chinese authorities. In fact, the FBI notes that WeChat “cannot provide records for accounts created in China,” but will provide “basic information” such as name, phone number, email, and IP address for “non-China accounts.”
The same could be said for other messaging platforms owned by foreign companies, which might not be compelled to respond to U.S. law enforcement agencies, but could be required to do so for court orders from their own governments.
In most cases, these other messaging platforms maintain their security by avoiding iCloud Backups entirely. Developers can choose what data is stored in an iCloud Backup, and apps like Signal deliberately refuse to store anything at all, which is why you basically have to set it up from scratch when switching to a new iPhone.
After all, the best way to keep your data from falling into the wrong hands is to avoid keeping it in the first place.