The latest high-stakes standoff between Apple and the FBI has come to an end. After claiming for months that Apple alone could unlock the two iPhones of Pensacola, Florida shooter Mohammed Saeed Alshamrani, the agency announced today that it had managed to do so without Cupertino’s help—and without undermining the encryption that protects over 1 billion iOS devices worldwide.
The détente comes five months after the attack last December at Naval Air Station Pensacola, in which Alshamrani killed three people and wounded eight more before being shot and killed by local law enforcement. The FBI recovered Alshamrani’s iPhone 5 and an iPhone 7 Plus in the wake of the shooting; the devices were badly damaged, which the Justice Department implied in January made it more difficult to break in through traditional methods. The stance was always curious. The FBI confirmed it had managed to get the iPhones up and running, and has access to forensics tools from companies like Cellebrite that claim the ability to break into any iOS device. Older models like Alshamrani’s should have been relatively trivial to crack. But as with the 2015 San Bernardino, California shooting, the high-stakes case proved too tempting an opportunity for the agency to try to set a precedent against strong encryption.
“Every time there’s a traumatic event requiring investigation into digital devices, the Justice Department loudly claims that it needs back doors to encryption, and then quietly announces it actually found a way to access information without threatening the security and privacy of the entire world,” says Brett Max Kaufman, senior staff attorney at the American Civil Liberties Union. “The boy who cried wolf has nothing on the agency that cried encryption.”
In both instances, the FBI wanted Apple’s help to establish a “back door” that would allow law enforcement to circumvent any iOS device’s encryption and access its data as needed. “The false claims made about our company are an excuse to weaken encryption and other security measures that protect millions of users and our national security,” Apple said in a statement Monday. “It is because we take our responsibility to national security so seriously that we do not believe in the creation of a backdoor—one which will make every device vulnerable to bad actors who threaten our national security and the data security of our customers. There is no such thing as a backdoor just for the good guys, and the American people do not have to choose between weakening encryption and effective investigations.”
In a press conference today, FBI director Christopher Wray said that the agency had to develop its own tool to access the iPhones. “We canvassed every partner out there and every company that might have had a solution to access these phones. None did,” said Wray. “So we did it ourselves. Unfortunately the technique that we developed is not a fix for our broader Apple problem. It’s a pretty limited application.”
It’s unclear what that difficulty stems from. While iOS remains plenty secure for the average user, recent vulnerabilities have given hackers and forensic investigators ample avenues to break into iPhones. “If the FBI was able to repair the hardware sufficiently to boot them up, then existing forensics tools are more than capable of recovering data from those devices,” says Dan Guido, founder of cybersecurity firm Trail of Bits. He points specifically to the so-called checkm8 exploit, publicized last September—an unfixable flaw that makes it possible to “jailbreak” any iPhone released from 2011 to 2017—which includes both of Alshamrani’s devices.
“The FBI could try as many PIN codes as they wanted until one worked,” says Guido, whose iVerify security app can tell if your phone is exposed to checkm8. “It was only a matter of time before they succeeded.”
In fact, iOS has seen several security lapses lately that, while largely harmless to the average user, make it possible for well-resourced technicians to break into devices. In addition to checkm8, vulnerability broker Zerodium recently announced that due to a glut of iOS and Safari bugs it wouldn’t accept certain classes of Apple bug submissions for the next several months.
“There’s been a proliferation of iOS vulnerabilities recently,” says Johns Hopkins University cryptographer Matthew Green. “There was a brief period around 2015 when Apple’s security outpaced the commercially available exploit market, and that period seems to be over.”
It’s unclear exactly how the FBI got the passcodes it needed. But the agency’s success in cracking the iPhones in its possession seems to undermine its central argument that Apple and other companies allow criminals to “go dark” by providing strong encryption on consumer devices. As in 2016 with the San Bernardino case, agents got in eventually.
“Using a device with known security flaws, like the iPhone 7 Plus, or a device without the latest security features, like an iPhone 5 which lacks the Secure Enclave, is a straightforward way to ensure law enforcement can access your phone when needed,” adds Guido.
That may explain why the tenor of both Wray’s and Attorney General William Barr’s arguments against encryption appeared to have shifted slightly. Rather than decrying the impossibility of gaining access, both Barr and Wray focused today on the investigatory costs of how long it took to do so. “The delay from getting into these devices didn’t just divert our personnel from other important work. It also seriously hampered this investigation,” said Wray. “Finally getting our hands on the evidence Alshamrani tried to keep from us is great, but we really needed it months ago, back in December, when the court issued its warrants.”
That timeline’s not quite right. Apple did respond to those early warrants, handing over what it describes as gigabytes of iCloud, account, and transactional data related to the case. The FBI didn’t tell Apple that there was a second iPhone, or that it was unable to access either device, until January 6. It’s unclear how much of the data the FBI found on Alshamrani’s devices had already been available through iCloud backups.
Despite the FBI’s repeated success in breaking into supposedly uncrackable iPhones, Barr insisted that Apple could design a back door that didn’t threaten to compromise iOS devices more broadly. “There is no reason why companies like Apple cannot design their consumer products and apps to allow for court-authorized access by law enforcement while maintaining very high standards of data security,” Barr said at today’s press conference. In fact, the landmark cryptography paper “Keys Under Doormats,” coauthored by Bruce Schneier among others, gives ample reasons why they cannot do that very thing.
Barr also signaled, though, that the Justice Department may no longer consider the courts as the best avenue to achieve that end. “The developments in this case demonstrate the need for a legislative solution,” he said, at another point suggesting that undermining encryption is a choice that Americans must make “through their representatives.”
Even so, all the FBI has proven today is that the choice remains moot. Weakening iOS encryption would threaten over 1 billion devices at once. Why force that, when so many of them have vulnerabilities that sophisticated forensics labs can already exploit?
“I think the idea that iPhones are ‘unhackable’ is obsolete,” says Green. “I think we all need to adjust our expectations accordingly, particularly when governments demand that firms break or weaken their encryption.”
The Justice Department has more targets than just Apple; it has increasingly focused on Facebook’s encryption as an investigatory impediment as well. But as long as breaking into most iPhones remains this manageable, its complaints seem less urgent than ever.
This story has been updated with comment from Apple.