Cops Don't Need a Crypto Backdoor to Get Into Your iPhone

The White House has denied the FBI's pleas for an encryption backdoor. But don't forget that feds can still sneak in through the window.

Late last week, the privacy community scored a victory in a year-long battle over the future of encryption: In internal discussions, the White House quietly overruled law enforcement and intelligence officials, deciding that it won't pursue a policy of pushing tech companies to put "backdoors" in their encryption that would allow government agencies to access decrypted private data. That's going to make it harder for the FBI to access private data, but they've still got plenty of other ways in.

To judge by FBI director James Comey's warnings to Congress and the public, last week's decision pushes us one step closer to a world where police surveillance "goes dark," encryption reigns supreme, and pedophiles and drug dealers enjoy perfect immunity from the cops. But before surveillance hawks prophesy doomsday or privacy doves celebrate, let's remember: For better or for worse, encryption usually doesn't keep determined cops out of a target's private data. In fact, it only rarely comes into play at all.

In 2014, for instance, law enforcement encountered encryption in only 25 of the 3,554 wiretaps it reported to the judiciary---about 0.7 percent of cases. And in 21 of those meager 25 incidents, investigators circumvented the encryption and accessed the target's unencrypted communications anyway.
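For scale, here's that arithmetic spelled out---a quick Python check of the report's figures, nothing more:

```python
# Sanity-check the 2014 federal wiretap-report figures cited above.
total_wiretaps = 3554
hit_encryption = 25
circumvented = 21

print(f"{hit_encryption / total_wiretaps:.1%} of wiretaps hit encryption")   # 0.7%
print(f"{circumvented / hit_encryption:.0%} of those were defeated anyway")  # 84%
```

In other words, encryption not only shows up rarely; when it does show up, it usually loses.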

"In spite of the big words the FBI has used over the last year, the situation isn't quite as dire as they make it out to be," says Chris Soghoian, principal technologist for the ACLU. "The kind of encryption tech companies are giving us is geared towards protecting us from a thief stealing our laptop. It's not designed to keep out a government agent trying to get your data with or without a court order."

Take Apple, which has become enemy number one in the FBI's anti-encryption rhetoric since it introduced default disk encryption for all of its phones last year. FBI director Comey has compared a default-encrypted iPhone to a "closet that can't be opened" even in as extreme a situation as, say, a kidnapping investigation. "Sophisticated criminals will come to count on these means of evading detection," Comey said in a speech at the Brookings Institution last year. "And my question is, at what cost?"

But despite the iPhone's title as the highest-security smartphone---or even consumer-focused computer of any kind---it still offers significant cracks for the cops to exploit, says Nick Weaver, a security researcher at Berkeley's International Computer Science Institute. "The iPhone is the hardest target, but in practice law enforcement can find a way in," Weaver says. "There are three or four ways into the typical iPhone. It takes someone really paranoid to have closed all of them."

As a reminder that the crypto backdoor debate isn't the beginning and end of digital privacy, here are a few of the de facto backdoors that still leave private data open to any law enforcement agency that seizes a locked, encrypted iPhone:

  • Wide Open iCloud: A modern iPhone encrypts its storage by default, but sends much of that sensitive data to the user's iCloud backup by default, too. If the user hasn't disabled that automatic uploading, police can subpoena Apple for that cloud-based data, including the suspect's photographs and iMessages. "iCloud backup is a disaster unto God and man," says Weaver. "It has no security at all against an arrest. They call Apple with a warrant and get a whole host of information."
  • Fingerprinting: Cops have long taken the fingerprints of arrestees. Now, instead of pressing a suspect's fingers to an inkpad, police can press them to the suspect's iPhone's TouchID sensor to immediately unlock it. When cops demand a password, a suspect can invoke the Fifth Amendment's protection against self-incrimination to avoid giving it up. But for the first 48 hours after a phone's last unlock, before TouchID automatically disables itself and demands a passcode, its owner has no such protection for their unique loops and whorls. "If your threat model is theft, the fingerprint reader is brilliant," Weaver says. "If your threat model is coercion by a government authority, it's worse than useless."
  • Laptop Exposure: If cops can't get onto an encrypted phone, they may have more luck with the suspect's laptop. There they often find unencrypted backups of the phone. Or, as iOS forensics expert and security consultant Jonathan Zdziarski points out, they can retrieve a so-called "pairing record," the key stored on a computer that tells a phone it's a trusted PC. With that pairing record in hand, cops can sync the phone with their own computer and offload the suspect's sensitive data. (See the sketch after this list for what both artifacts look like on a seized laptop.)
  • Leaky Siri: If a suspect won't squeal, Siri sometimes will. iPhones have Siri enabled from the lock screen by default, and even from the lock screen it will answer queries for the user's most recent incoming or outgoing call, contacts, and even their entire calendar. "This isn't so much of a backdoor as an information leak," says Zdziarski.
  • Breaking In: If law enforcement can't find an open door into a phone, it may be able to break and enter. A fully functioning remote zero-day exploit for an iPhone sells for around $1 million, but ones that target phones with outdated software may be more accessible. Just last month, for instance, security researcher Mark Dowd found a method of breaking into any iPhone via its AirDrop Bluetooth connection. Apple quickly patched the flaw. But any criminal target who hasn't kept their phone updated has left a wireless entryway into their phone's sensitive data.
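The laptop-exposure route is concrete enough to sketch. The Python snippet below is a minimal illustration, not a forensics tool: it checks a Mac for the two artifacts Zdziarski describes, unencrypted MobileSync backups and lockdown pairing records. The paths follow Apple's conventional layout (recent macOS versions restrict /var/db/lockdown to root), and the specific plist keys it looks for are illustrative assumptions, not an exhaustive list.

```python
# Minimal sketch: what "laptop exposure" looks like on a seized Mac.
# Uses only the standard library; paths and key names follow the
# conventional macOS layout and may vary by OS version.
import plistlib
from pathlib import Path

BACKUP_DIR = Path.home() / "Library/Application Support/MobileSync/Backup"
LOCKDOWN_DIR = Path("/var/db/lockdown")  # Windows: C:\ProgramData\Apple\Lockdown

# 1. Local backups: an unencrypted backup is a near-complete copy of the phone.
for manifest_path in BACKUP_DIR.glob("*/Manifest.plist"):
    with manifest_path.open("rb") as f:
        manifest = plistlib.load(f)
    status = "encrypted" if manifest.get("IsEncrypted") else "UNENCRYPTED"
    print(f"Backup {manifest_path.parent.name}: {status}")

# 2. Pairing records: the keys and certificates that mark this computer as a
#    "trusted PC," reusable to sync data off the locked phone itself.
for record_path in LOCKDOWN_DIR.glob("*.plist"):
    with record_path.open("rb") as f:
        record = plistlib.load(f)
    found = [k for k in ("HostID", "HostPrivateKey", "EscrowBag") if k in record]
    print(f"Pairing record {record_path.stem}: {', '.join(found) or 'no known keys'}")
```

The flip side is the defense: encrypt local backups with a strong password, and a seized laptop gives up far less.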

For each of those vulnerabilities, users can turn off a default feature or take an extra precaution to keep out the cops. But few iPhone owners---even sophisticated criminals---are likely to be so careful. "Apple has done a really great job of locking phones down," says Zdziarski. "But it still requires a security-conscious user, and there are still ways to screw it up and leave yourself exposed."

The FBI and the NSA will no doubt continue to press for encryption backdoors, and they'll likely try their luck again with the next presidential administration in 2017. In the meantime, they'll have to stop berating Apple, and instead rely on the more dependable backdoor recipe: technological complexity and old-fashioned human carelessness.