When law enforcement argues it needs a “backdoor” into encryption services, the counterargument has typically been that it would be impossible to limit such access to one person or organization. If you leave a key under the doormat, a seminal 2015 paper argues, a burglar eventually finds it. And now recent events suggest an even simpler rebuttal: Why entrust a key to someone who gets robbed so frequently?
This aptly describes US intelligence services of late. In March, WikiLeaks released nearly 9,000 documents exposing the CIA’s hacking arsenal. More so-called Vault 7 secrets trickled out as recently as this week. And then there’s the mysterious group or individual known as the Shadow Brokers, which began sharing purported NSA secrets last fall. April 14 marked its biggest drop yet, a suite of hacking tools that target Windows PCs and servers to devastating effect.
The fallout from the Shadow Brokers has proven more concrete than that of Vault 7; one of its leaked exploits, EternalBlue, facilitated last month’s WannaCry ransomware meltdown. A few weeks later, EternalBlue and two other pilfered NSA tools helped advance the spread of Petya, a ransomware outbreak that looks more and more like an act of cyberwar against Ukraine.
Petya would have caused damage even absent EternalBlue, and the Vault 7 dump hasn’t yet resulted in a high-profile hack. But the fact that all of this has fallen into public hands shifts the nature of the encryption debate from hypothetical concern that someone could reverse-engineer a backdoor to acute awareness that someone could simply steal it. In fact, it should end the debate altogether.
“The government asking for backdoor access to our assets is ridiculous,” says Jake Williams, founder of Rendition Infosec, “if they can’t first secure their own classified hacking tools.”
If you think about the encryption debate at all, it’s likely in the context of the 2016 showdown between the FBI and Apple. The former wanted access to San Bernardino shooter Syed Rizwan Farook’s locked iPhone; the latter argued that writing special code to break its own security measures would set a dangerous precedent.
That case ended in something like a draw. The FBI paid an outside company to break into the iPhone, quitting the court case before either side got a definitive ruling.
Apple facing off against the FBI was certainly high profile, but it only amounted to one skirmish in a long-fought encryption war. In the wake of the March terrorist attack by Khalid Masood outside the British parliament, UK home secretary Amber Rudd called for police and intelligence agencies to have access to encrypted messaging services like WhatsApp. British prime minister Theresa May struck a similar chord following a terror attack in London earlier this month.
In fact, you needn’t look even that far back to see encryption under duress. Five Eyes, the intelligence-sharing alliance of the US, UK, Canada, Australia, and New Zealand, met just this week to discuss its national security priorities. “We committed to develop our engagement with communications and technology companies to explore shared solutions while upholding cybersecurity and individual rights and freedoms,” the group wrote Tuesday morning, pushing for an encryption compromise that does not technologically exist.
A few hours later, reports began to emerge that Petya was wending its way through networks around the world, thanks in part to exploits that the NSA failed to secure.
“I think Vault 7 and Shadow Brokers illustrate the challenges that even intelligence agencies have in securing extremely sensitive information,” says Andrew Crocker, staff attorney with the Electronic Frontier Foundation. And it’s hard to think of information that would be more sensitive than special access to the world’s encryption protocols.
The intelligence community’s inability to keep its secrets looks bad enough on its face. But remember that Vault 7 and Shadow Brokers are simply the thefts that have gone public.
“It hints at a much larger problem of nation-states probably taking these exploits from each other and sitting on them, to analyze them and use them defensively,” says Drew Mitnick, policy counsel at digital rights group Access Now. “If there were an encryption backdoor tool that were compromised by nation-states, we might not know. It might not become public in the way these recent attacks did.”
It would certainly provide a high-profile target. Any sort of publicized encryption backdoor—mandated, say, through legislation—would draw the immediate attention of foreign powers, bad actors, and basically any hacker looking for the keys to kingdoms that are, in some cases, billions of users strong. If they acquired them, well, game over.
“The dangers posed by leak or theft of keys used in a key escrow system, for example, are potentially catastrophic,” says Crocker, referring to a potential method by which the government could access an encryption backdoor.
“If a hacker were to compromise a significant encryption platform, we could see something much worse than the WannaCry ransomware attack,” says Mitnick. WannaCry froze up hundreds of thousands of computers; WhatsApp, which uses Open Whisper Systems’ Signal Protocol, has well over a billion users with default, end-to-end encrypted chat. The implications come into even sharper relief when you consider countries where access to encrypted chat provides the best defense against oppressive regimes.
The NSA and CIA’s recent misadventures in securing their wares are just one of many points in favor of encryption. After months of spy agency tools gone rogue, though, the only argument needed should be a lesson you probably learned in junior high: Don’t share secrets with people who can’t keep them.