Why You Should Care About Apple’s Fight With the FBI

The FBI wants Apple’s help to investigate a terrorist attack. Apple says providing this help is the real danger. We’ve reached a boiling point in the battle between tech companies and the government over encryption. And what happens will affect anyone who uses a smartphone, including you.

After the San Bernardino shootings, the FBI seized the iPhone used by shooter Syed Rizwan Farook. The FBI has a warrant to search the phone’s contents, and because it was Farook’s work phone, the FBI also has permission from the shooter’s employer, the San Bernardino County Department of Public Health, to search the device. Legally, the FBI can and should search this phone. That’s not up for debate. If the FBI gets a warrant to search a house and the people who own it say okay, there’s no ambiguity about whether it can search the house.

But if the FBI comes across a safe in that house, the warrant and permission don't mean it can force the company that manufactures the safe to create a special tool for opening its safes, let alone a tool that would render every other safe the company makes useless as secure storage. That's the situation Apple is dealing with here.

The FBI obtained an order from a California district court directing Apple to assist in cracking Farook’s passcode. The court order doesn’t flat-out demand that Apple unlock the phone, which is an iPhone 5c* running iOS 9. Instead, the judge is ordering Apple to create a new, custom, terrorist-phone-specific version of its iOS software to help the FBI unlock the phone. Security researcher Dan Guido has a great analysis of why it is technically possible for Apple to comply and create this software. (It would not be possible if Farook had used an iPhone 6, because Apple built a special security protection called the Secure Enclave into its newer phones that cannot be bypassed by customizing iOS.)

The fight isn’t over whether Apple can comply in this case. It’s whether it should.

If Apple makes this software, it will let the FBI bypass two security measures: an auto-erase function that destroys the key needed to decrypt the phone’s data after ten incorrect passcode entries, and a forced delay after each wrong guess. Since the FBI wants to use the brute force cracking method (basically, trying every possible passcode), both of those protections need to go to crack Farook’s passcode. (Of course, if he used a shitty passcode like 1234, the delay wouldn’t be as big a problem, since the FBI could quickly guess it.)
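
To see why those two protections matter so much, here’s a back-of-the-envelope sketch in Python. The ~80 milliseconds per guess is an assumption drawn from security researchers’ estimates of how long the phone’s hardware takes to check one passcode; it is not an official Apple figure.

```python
# Back-of-the-envelope math on brute forcing an iPhone passcode.
# The ~80 ms per guess is an assumption based on security researchers'
# estimates of the phone's hardware key-derivation time, not an Apple spec.

GUESS_COST_SECONDS = 0.08  # assumed time to test one passcode

def worst_case_hours(digits: int) -> float:
    """Hours needed to try every possible numeric passcode of this length."""
    return (10 ** digits) * GUESS_COST_SECONDS / 3600

for digits in (4, 6):
    print(f"{digits}-digit passcode: {worst_case_hours(digits):.1f} hours worst case")

# Prints roughly:
#   4-digit passcode: 0.2 hours worst case
#   6-digit passcode: 22.2 hours worst case
```

Without those protections, a four-digit passcode falls in minutes and even a six-digit one in about a day; with them, brute forcing is effectively impossible. That’s exactly what the FBI is asking Apple to switch off.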

The security measures the FBI wants to get around are crucial privacy features in iOS 9, because they safeguard your phone against criminals and spies mounting that same brute force attack. So it’s not surprising that Apple is opposing the court order. There is more than one person’s privacy at stake here!
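
If you’re curious what the lockout logic the FBI wants neutered actually looks like, here’s a conceptual sketch. The delay schedule approximates what iOS 9 reportedly enforced (a one-minute wait after the fifth wrong guess, escalating from there), and wipe_device is a hypothetical stand-in for throwing away the decryption key; this is an illustration of the idea, not Apple’s code.

```python
# Conceptual sketch of the lockout logic at issue: escalating delays after
# repeated failures, then key destruction. The schedule approximates what
# iOS 9 reportedly used; wipe_device() is a hypothetical stand-in.

import time

DELAYS = {5: 60, 6: 5 * 60, 7: 15 * 60, 8: 15 * 60, 9: 60 * 60}  # seconds
WIPE_THRESHOLD = 10  # tenth failure: throw away the decryption key

failed_attempts = 0

def wipe_device():
    # Hypothetical: erase the key that protects the data. Without it,
    # the encrypted contents are unrecoverable.
    raise SystemExit("Key erased; data is now unreadable.")

def check_passcode(entered: str, correct: str) -> bool:
    global failed_attempts
    if entered == correct:
        failed_attempts = 0
        return True
    failed_attempts += 1
    if failed_attempts >= WIPE_THRESHOLD:
        wipe_device()
    time.sleep(DELAYS.get(failed_attempts, 0))  # force the attacker to wait
    return False
```

The custom iOS the court order describes would, in effect, strip the delay and wipe steps out of logic like this, leaving nothing between an attacker and unlimited guesses.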

Apple equates building a new version of iOS with building an encryption backdoor. CEO Tim Cook published a message emphasizing that the company can’t build a backdoor for one iPhone without screwing over security for the rest:

In today’s digital world, the “key” to an encrypted system is a piece of information that unlocks the data, and it is only as secure as the protections around it. Once the information is known, or a way to bypass the code is revealed, the encryption can be defeated by anyone with that knowledge.

The government suggests this tool could only be used once, on one phone. But that’s simply not true. Once created, the technique could be used over and over again, on any number of devices. In the physical world, it would be the equivalent of a master key, capable of opening hundreds of millions of locks — from restaurants and banks to stores and homes. No reasonable person would find that acceptable.
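
Cook’s point about keys is easy to demonstrate in a few lines of Python (this minimal sketch uses the third-party cryptography library). The encryption doesn’t know or care who is holding the key, which is why “we’ll only use it once” offers no technical guarantee:

```python
# Minimal demonstration that encrypted data is only as secure as its key.
# Uses the third-party `cryptography` library (pip install cryptography).
from cryptography.fernet import Fernet

key = Fernet.generate_key()                      # the secret guarding everything
token = Fernet(key).encrypt(b"private messages")

# The ciphertext is useless on its own...
# ...but anyone holding a copy of the key can read it. The math doesn't
# know whether that's Apple, the FBI, or whoever steals the key next.
someone_elses_copy = key
print(Fernet(someone_elses_copy).decrypt(token))  # b'private messages'
```

Whether the “key” is a literal cryptographic key or, as here, a signed tool that bypasses the protections around one, the logic is the same: it works for whoever has it.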

Apple would be writing its own malware if it complied with this order: the best tool in existence for breaking into its own (older) devices.

“Essentially, the government is asking Apple to create a master key so that it can open a single phone,” the Electronic Frontier Foundation wrote in a statement supporting Apple. “And once that master key is created, we’re certain that our government will ask for it again and again, for other phones, and turn this power against any software or device that has the audacity to offer strong security.”

Don’t sit there chuckling if you use an Android phone, by the way. If Apple is compelled to create this malware, it will affect anyone who uses technology to communicate, to bank, to shop, to do pretty much anything. The legal basis for requesting this assistance is the All Writs Act of 1789, an 18th-century law that is becoming a favorite of government agencies trying to get tech companies to turn over user data. The AWA is not really as obscure as Apple suggests, but it is a very broad statute that allows courts established by Congress to “issue all writs necessary or appropriate in aid of their respective jurisdictions and agreeable to the usages and principles of law.”

The Department of Justice has even tried to use it before to force Apple to turn over suspects’ messages. I know 18th-century law sounds boring, but this is an 18th-century law that could fuck you big time.

The All Writs Act can only be used to force a company to do something if compliance isn’t an “undue burden.” Making Apple create malware that fundamentally undermines its core security features certainly seems like an enormous burden. And if the courts don’t deem it “undue” in this case, that sets a horrible precedent: if compelling Apple to maim itself is allowed, compelling Google and Facebook and Microsoft to write security backdoors would be allowed too.

Correction 1:06pm: *I originally wrote that Farook had an iPhone 5S. He had an iPhone 5c. The post has been updated to fix that error.

Image: Getty
