New York Has a $10 Million Lab Dedicated to Hacking Into iPhones
A new inside look at a cybercrime lab in lower Manhattan reveals the lengths that law enforcement feels it must go to in order to access iPhones seized as part of criminal investigations, and offers more insight into just how tough the security measures are that Apple employs to protect the privacy of its users.
A profile by Fast Company takes a look inside the $10 million high-tech forensics lab that was developed by Manhattan District Attorney Cyrus Vance, a long-time and very vocal critic of Apple’s iPhone encryption policies who has gone so far as to call the iPhone “a gift from Apple to sex traffickers.”
To be fair, Vance and his team of forensic investigators are more often than not stymied by Apple's security, which is understandably frustrating for a team whose job is to bring criminals to justice. Vance told a Senate committee last month that his lab sees about 1,600 devices each year, of which 82 percent are locked, and that investigators can bypass the security on only about half of them, resulting in many serious cases that can't be prosecuted for lack of evidence. While that's something of a strawman argument (it's difficult to say whether accessing the iPhones would actually yield additional evidence), it's clear that Vance is determined to do everything he can to find out.
The Lab
The “lab” that Vance refers to in his comments is a facility in the Lefkowitz Building in lower Manhattan that, as Fast Company points out, “looks like an artifact from the Apollo program.” A radio frequency isolation chamber shields the seized iPhones from receiving signals that could cause them to be remotely wiped. The chamber is protected by two airtight, RF-shielded magnetic doors that form the sort of “radio frequency airlock” usually seen only in the most secure military communications facilities, designed to make absolutely certain that no stray emissions can get in or out.
Inside the room are dozens of Apple iPhones and iPads that have been confiscated as evidence in various criminal investigations, and many are in various states of disrepair, offering additional challenges to the forensic team, which may often need to restore them to working order before they can even begin trying to break into them. In addition to supercomputers that actually try to break into the iPhones, the lab includes a number of specialized tools for repairing damaged devices, including a robot that can remove memory chips without using heat.
The actual hacking is done by brute force; there’s no magic solution for breaking Apple’s encryption itself, so investigators instead use two powerful computers to generate passcode combinations in an attempt to find one that works, leveraging technology from devices like the GrayKey box to prevent the iPhone from wiping itself after 10 failed passcode attempts.
How long it takes the team to do this for each iPhone depends largely on the length and complexity of the passcode. A four-digit numeric passcode requires only 10,000 combinations to be tried — usually fewer, since you’re likely to hit it before exhausting them all — and can be cracked in an average of seven minutes. However, Apple has defaulted to six-digit passcodes since 2015, which offer one million possible combinations and take several hours to guess by brute force. Newer iPhones that use Apple’s Secure Enclave chip can take even longer, due to the additional time it takes the Secure Enclave to process each passcode attempt.
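The arithmetic behind those estimates is straightforward. The sketch below derives a hypothetical attempt rate from the article's own figure (an average of seven minutes for a four-digit code) and extrapolates to six digits; the actual rate achievable against a real iPhone depends on the hardware, the iOS version, and Secure Enclave throttling, none of which are specified in the article.

```python
# Rough brute-force time estimates for numeric passcodes.
# ATTEMPTS_PER_SECOND is a hypothetical rate implied by the article's
# claim that a 4-digit code (10,000 combinations, ~5,000 tried on
# average) falls in about 7 minutes; real-world rates will differ.
ATTEMPTS_PER_SECOND = (10_000 / 2) / (7 * 60)  # ~11.9 attempts/second

def average_crack_time_seconds(digits: int) -> float:
    """Expected time to find a random numeric passcode of the given length.

    On average, an exhaustive search succeeds after trying half of the
    10**digits possible combinations.
    """
    combinations = 10 ** digits
    return (combinations / 2) / ATTEMPTS_PER_SECOND

for n in (4, 6):
    secs = average_crack_time_seconds(n)
    print(f"{n}-digit passcode: ~{secs / 3600:.1f} hours on average")
```

At this assumed rate, the six-digit default works out to roughly half a day on average, consistent with the article's "several hours" — which is why the Secure Enclave's per-attempt delays matter so much: they cut the effective attempt rate, stretching the same search into a much longer job.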
Steve Moran, the director of the High Technology Analysis Unit, told Fast Company that he also tries other methods to narrow things down: since many users don’t choose completely random passcodes, things like birthdays or favorite athletes can provide clues.
Do they like the Mets? Do they like the Yankees? Is their favorite player Derek Jeter? Is their favorite player Mickey Mantle? What’s the dog’s name? What’s the kid’s birthday? What’s their birthday? Where did they get married? What date did they get married? We are looking for any edge that we can try to find.
The team also needs to decide which iPhones to prioritize, since there is almost always a backlog of hundreds or even thousands of iPhones related to active investigations that the unit can’t access. This involves not only escalating the most important cases, but also determining which third-party solutions will open up access to the largest number of iPhones.
iCloud Backups Aren’t Enough
While Apple regularly provides data from iCloud Backups in response to court orders, Vance says that this is usually insufficient in criminal investigations, since serious criminals don’t back up their devices in the first place, and private messaging apps like WhatsApp, Signal, and Telegram are specifically designed not to store their data in iCloud, but only on the local device.
Further, as Moran points out, an iCloud Backup won’t necessarily cover the window between when a crime takes place and when a suspect shuts off their iPhone. In fact, the most recent backup may have occurred hours or even days before the iPhone was seized, since iCloud Backups by default run automatically only once every 24 hours, and only when the iPhone is connected to Wi-Fi and plugged into a power source.
Vance, who has long been pushing for companies like Apple to create a backdoor that would allow unfettered access to law enforcement, told Fast Company that he believes that Apple already has a secret backdoor and that it’s simply refusing to share it with law enforcement.
They [Apple] get into my phone all the time because they upgrade my operating systems and they send me messages.
On the other side, Apple of course continues to maintain that no such backdoor exists, and that creating one would be “the software equivalent of cancer” as there would be no way to guarantee that it could only be used by law enforcement officials in legitimate cases.
However, Vance insists that it’s “not fair” that Apple and Google can set unilateral rules that block law enforcement from doing its job. “That’s not their call,” he says, because “there’s something bigger here at issue” than their own opinions on privacy and public safety. In his view, privacy needs to be balanced equally against giving law enforcement what it needs to do its job: ensuring not only that those guilty of committing crimes properly answer for them, but also that those accused of crimes they didn’t commit aren’t unfairly prosecuted when there’s evidence available that could exonerate them.