The battle over privacy and encryption that’s once again brewing between Apple and the FBI in the case of the Pensacola shooter is not only bringing back the debate between law enforcement and personal privacy, but it also serves as an important reminder of how iPhone security works and what you can do to maximize the security on your device.
The current situation that Apple is now facing with the FBI and the U.S. Justice Department began on Dec. 6, when Mohammed Saeed Alshamrani, a Saudi Air Force cadet who had been training with the U.S. military, carried out a shooting attack at the Naval Air Station in Pensacola, Florida, killing three people and wounding eight.
The request for Apple’s involvement didn’t become public until a month later, when the FBI officially asked for Apple’s help unlocking two iPhones owned by the shooter. After U.S. Attorney General William P. Barr entered the fray and accused Apple of offering “no substantive assistance,” the company quickly responded by pointing out that it had in fact been working with the FBI since the very day of the shooting, supplying whatever data it could (mostly iCloud backups and account information from Apple’s servers). It wasn’t until Jan. 6 that the FBI asked for Apple’s help actually getting into the iPhone in question, while also revealing that there was a second iPhone involved that Apple wasn’t previously aware of.
The Mystery of the FBI’s Need for Apple
One of the most unusual questions that’s come up over the past week is exactly why the FBI needs Apple’s help. Both of the iPhones in question are considerably older models — an iPhone 5 and an iPhone 7 Plus — which should be easily accessible with modern forensic tools, not to mention that they’re both vulnerable to a major exploit discovered last year. Most security experts agree that it should be trivial for the FBI to get into either iPhone using tools that are already available, and in fact a new report revealed that the FBI successfully gained access to an iPhone 11 Pro Max last fall.
A somewhat cynical point of view is that the U.S. Justice Department and Trump administration are trying to use this as a way of forcing Apple’s hand by turning public opinion against the use of strong encryption by Apple and other big tech companies, and certainly the fact that President Donald Trump has personally gotten involved would seem to support this notion, especially in light of the U.S. Senate’s increasing demands that companies build backdoors into their devices for law enforcement access.
However, other possible explanations exist, including the fact that both of the iPhones in question were apparently damaged in the incident, which may be affecting the FBI’s ability to use the standard forensic and hacking tools with them. Reports indicate that the FBI was able to repair both iPhones to the point where they would power on, but it’s difficult to say whether there’s other damage that may be hampering the efforts to access the data on both of the iPhones. The FBI explained that it only approached Apple after exhausting numerous other possibilities, so it does seem like FBI investigators may have hit a wall.
Brute Force Is Needed
There’s another consideration here, however: none of the available forensic tools actually breaks the encryption on the iPhone. That’s impossible in all practical terms due to the nature of strong encryption; it would take billions of years for even the most powerful supercomputer on the planet to crack. Even Apple itself cannot break the encryption without knowing the key, which is derived from the user’s passcode.
The only realistic way into an encrypted iPhone is to know — or guess — the user’s passcode, and in a criminal investigation, unless FBI agents are lucky enough to find the passcode taped to a sticky note on the suspect’s desk, this generally requires “brute forcing” the passcode — basically guessing every possible combination until you find the one that works.
In fact, even the FBI realizes this. In the case of the San Bernardino shooter back in 2016, they weren’t asking Apple to break the encryption (they know that’s impossible), but rather to create a custom version of iOS that would make it easier for them to try every possible passcode.
The problem is that, as an added security measure, iOS limits users to 10 passcode guesses before the device is erased (and even before that, it imposes increasingly long delays between attempts). Since investigators are unlikely to hit the correct passcode in fewer than 10 tries, this is where devices like the GrayKey box come in: they exploit flaws in iOS to get around the 10-attempt limit, then hit the iPhone with every possible passcode combination until they find the one that actually unlocks it. At that point, everything on the iPhone becomes accessible just as if the correct passcode had been entered in the first place.
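The core of such an attack is nothing more than exhaustive enumeration. A minimal sketch (the `check` callback is a hypothetical stand-in for whatever mechanism actually submits a guess to the device):

```python
import itertools

def brute_force(check, length=4, alphabet="0123456789"):
    # Enumerate every possible passcode in order until one "unlocks".
    for attempt in itertools.product(alphabet, repeat=length):
        code = "".join(attempt)
        if check(code):
            return code
    return None  # exhausted the keyspace without a match

# Toy stand-in for the device's passcode check (hypothetical).
found = brute_force(lambda code: code == "0042")
```

In practice, tools like GrayKey must also defeat the attempt limit and the escalating delays; the enumeration itself is the easy part.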
Longer Passcodes Are More Secure
However, as Jack Nicas explains in The New York Times, this could be made considerably more difficult if the Pensacola shooter used a longer passcode. The longer the passcode, the more possible combinations of numbers (and perhaps letters) exist, and since forensic tools have to run through every possible combination, the longer it takes to guess the correct one.
“That approach means the wild card in the Pensacola case is the length of the suspect’s passcode. If it’s six numbers — the default on iPhones — authorities almost certainly can break it. If it’s longer, it might be impossible.”

— Jack Nicas, The New York Times
In fact, as Nicas explains, here’s how long it takes on average to break into an iPhone when a passcode contains only the numbers 0-9:
- Four digits: 7 minutes
- Six digits: 11 hours
- Eight digits: 46 days
- Ten digits: 12.5 years
The previous default on older iPhone models was four digits, but since iOS 9 users have been prompted to use six-digit passcodes by default. They can still manually choose to go back to a four-digit passcode, or opt for a longer numeric code or an alphanumeric password instead.
In fact, if a user has chosen an alphanumeric password, the amount of time required for a hacker to brute-force their way into the iPhone increases significantly: a simple six-character password would take an average of 72 years to guess (and that’s using only letters and numbers, not symbols), and it goes up exponentially from there. Add just two more characters, and the average time to brute-force the password increases to roughly 288,000 years.
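The same half-the-keyspace arithmetic extends to alphanumeric passwords. A sketch, assuming a 62-character alphabet (upper- and lowercase letters plus digits, which is what reproduces the quoted 72-year figure) and the same ~12.5 attempts per second:

```python
RATE = 12.5                       # attempts per second (80 ms each)
SECONDS_PER_YEAR = 365.25 * 24 * 3600

def avg_years(alphabet_size, length, rate=RATE):
    # Average crack time = half the keyspace at the Secure Enclave's rate.
    return alphabet_size ** length / rate / 2 / SECONDS_PER_YEAR

six_chars = avg_years(62, 6)    # about 72 years
eight_chars = avg_years(62, 8)  # several hundred thousand years
```

Adding symbols to the alphabet, or more characters to the password, pushes these numbers up exponentially again.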
While this may seem like a long time considering how fast modern computers can churn out passcodes, in the case of the iPhone the attempts are slowed down by the fact that it takes 80 milliseconds for the iPhone’s Secure Enclave — the hardware chip that stores all of the encryption keys — to process each passcode attempt. As Nicas points out, this slows a brute-force attack that could otherwise try thousands of passcodes a second down to only about 12 tries per second.
From there the math is pretty straightforward: a four-digit passcode has 10,000 possible combinations (0000-9999), and would therefore take a maximum of about 833 seconds, or roughly 14 minutes, to guess. Of course, since you’re likely to hit the correct passcode before you’ve tried every single combination, the average time it takes to break into an iPhone is half of that.
So in the case of the Pensacola shooter, “he might have just picked a good passcode,” as Matthew D. Green, a cryptography professor at Johns Hopkins University, points out. Since the terrorist attack was clearly premeditated, and the shooter even deliberately tried to destroy one of his iPhones to prevent investigators from gaining access, “it’s entirely possible he did his research and planned ahead.”
The problem is that if that’s the case, Apple is not going to be able to help the FBI even if it wanted to. Most security researchers, some of whom are former Apple engineers who have gone on to start their own forensic companies, generally agree that there’s nothing special Apple can do that third-party forensic tools can’t already accomplish by themselves. “It’s just something that’s going to take time to crack,” says Dan Guido, head of the security research firm Trail of Bits.
As John Gruber points out at Daring Fireball, while iOS can be and has been hacked — to bypass the limit on passcode attempts, for example — the Secure Enclave is a hardware component, and its limitations can’t simply be patched away.
“It’s the Secure Enclave that evaluates a passcode and controls encryption, and the 80 millisecond processing time for passcode evaluation isn’t an artificial limit that could be set to 0 by hackers. It’s a hardware limitation, not software.”

— John Gruber, Daring Fireball
While the 80 millisecond processing time obviously helps to improve security, it’s likely not something that was deliberately created by Apple, but rather a function of how the encryption works. Keep in mind that the encryption key is derived from the passcode — that is, the passcode is used to unlock and generate the key itself, which takes some time due to the complexities of cryptographic math. This is not simply a matter of checking the entered passcode to see if it matches what’s stored in the Secure Enclave.
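The general idea can be illustrated with a standard password-based key-derivation function such as PBKDF2. This is a sketch only: iOS actually entangles the passcode with a device-unique hardware key inside the Secure Enclave, and the real algorithm and parameters here (salt, iteration count) are illustrative, not Apple’s:

```python
import hashlib

def derive_key(passcode: str, salt: bytes, iterations: int = 200_000) -> bytes:
    # Stretch the passcode into a 32-byte key (AES-256-sized).
    # Every guess forces an attacker to redo this entire computation;
    # the iteration count (made up here) is what makes each try slow.
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(), salt, iterations)

key = derive_key("123456", b"per-device-salt")
```

Because the key only exists after this derivation runs, there is nothing stored on the device to compare a guess against directly — which is why the per-attempt cost can’t simply be switched off.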
Unfortunately, this is exactly why lawmakers are pushing for companies like Apple to create a backdoor that would allow them to easily gain access to encrypted iPhones. The problem, of course, is that due to the way encryption works, a backdoor would essentially have to take the form of a “master key” that could be used to unlock any iPhone on the planet — a huge security risk should such a key ever fall into the wrong hands, which it inevitably would. While there are arguably more complex and secure ways for companies like Apple to accomplish this, it doesn’t change the fact that once any backdoor system is built, that very same backdoor is open to all sorts of new security issues and exploits.
What This Means For You
By now it should be obvious that it’s trivial to hack an iPhone that’s using a four-digit passcode, and not all that much harder for a determined hacker to get into one secured by a six-digit passcode. And this capability isn’t exclusive to the FBI; criminals can get at the data on your iPhone just as easily.
So if you’re concerned about keeping the data on your iPhone private, we strongly recommend choosing a longer passcode, or better yet, an actual password. Four-digit passcodes were once a necessity of convenience — who wants to key in a long number or word every time they pull out their iPhone to check email or Facebook? — but the advent of Touch ID and Face ID has made it easy to pick a more secure password, since you’ll rarely have to actually enter it. That’s the very point of Touch ID and Face ID: not that these systems are inherently more secure than passcodes, but that they offer better security through convenience, by letting users select stronger passcodes they seldom have to type.