Apple vs. FBI: How This Battle Impacts You

Stephen Lam/Getty Images

Privacy is both a principle and a product to Apple. So the company has a lot to lose in a developing battle with the FBI, which has obtained a court order instructing Apple to help federal investigators access the data stored on an iPhone belonging to one of the suspects in the San Bernardino, California, mass shooting. It’s a fight about much more than the single iPhone in question, both for Apple and for the millions of people who use its devices.

If you haven’t been keeping up with the sequence of events, here’s a quick recap of what happened: A magistrate judge in California ordered Apple to help the FBI gain access to an iPhone used by one of the suspected San Bernardino shooters. The FBI recovered the phone in December, but Apple’s encryption technology prevents the agency from accessing the iPhone’s contents. The agency isn’t asking Apple to unlock the phone directly, something Apple can’t do. Instead, it wants the company to write and install a custom version of the iPhone’s software that would disable the features that make it difficult to guess the phone’s passcode.

As Dawn Chmielewski reports for Re/code, the primary feature the FBI wants to bypass is one that automatically erases the data on the phone after 10 incorrect passcode attempts. By removing security features and adding new capabilities to an alternate version of the operating system, Apple would enable passcodes to be entered electronically, making it easier for the agency to unlock the iPhone by brute force: trying thousands or millions of combinations via a computer. Apple has refused, posting a strongly worded letter from CEO Tim Cook arguing that compliance would set a dangerous precedent of undermining “the very freedoms and liberty our government is meant to protect.”

How the iPhone’s encryption and passcode work

As Vox’s Timothy B. Lee explains, the court ordered Apple to make it easier for the FBI to guess the passcode because accessing the phone any other way would be very difficult. The encryption chip on the iPhone uses a powerful algorithm to protect the data stored on the phone. Each iPhone has a unique encryption key that scrambles or unscrambles the phone’s data. That key is 256 bits long (a string of 256 ones and zeroes), which means there are more than a trillion trillion trillion trillion trillion trillion possible encryption keys, roughly 10^77 of them. Guessing every possible encryption key until you find the right one would take “many lifetimes even if every computer on the planet were working on the problem.”
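
To make that concrete, here’s a rough back-of-the-envelope calculation in Python. The guessing rate is an assumption chosen purely for scale (a billion machines each testing a billion keys per second), not a real benchmark:

```python
# Back-of-the-envelope check of the key-space math above (illustrative only).
keyspace = 2 ** 256  # number of possible 256-bit encryption keys

# Assume (very generously) a billion machines, each testing a billion keys/sec.
guesses_per_second = 10 ** 9 * 10 ** 9

seconds_per_year = 60 * 60 * 24 * 365
years = keyspace / guesses_per_second / seconds_per_year

print(f"{keyspace:.3e} possible keys")        # ~1.158e+77
print(f"~{years:.1e} years to try them all")  # ~3.7e+51 years
```

Even under those wildly optimistic assumptions, exhausting the key space takes about 10^51 years, which is where the “many lifetimes” quote comes from.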

But Apple doesn’t keep copies of iPhone keys, so the FBI wants to exploit the weakest link in the iPhone’s security: the passcode that the owner uses to unlock the iPhone. The encryption chip won’t unscramble the phone’s data until the correct passcode is entered. That passcode is only four or six digits long, and thus has only 10,000 possible values for a four-digit code (1 million for a six-digit one). If you’re trying to access an iPhone’s contents, it’s much faster to guess the passcode than the underlying encryption key. But the FBI needs Apple’s help, since the iPhone protects against brute-force attacks in two ways: by delaying its acceptance of additional guesses after several incorrect passcode entries, and by letting users switch on an auto-erase feature, which permanently disables access to the encrypted data by deleting the information needed to unscramble it.
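
Here’s a small illustrative sketch of why the passcode, not the key, is the target. The roughly 80 milliseconds per guess is an assumption based on the key-derivation delay Apple has described, so treat the timings as ballpark figures:

```python
# Rough comparison of passcode-guessing times (timings are assumptions).
# A 4-digit PIN has 10**4 combinations; a 6-digit PIN has 10**6.
for digits in (4, 6):
    combos = 10 ** digits
    # Assume ~80 ms per electronic guess, dominated by key derivation.
    worst_case_seconds = combos * 0.08
    print(f"{digits}-digit PIN: {combos:,} combinations, "
          f"worst case ~{worst_case_seconds / 3600:.1f} hours")
# 4-digit PIN: 10,000 combinations, worst case ~0.2 hours (about 13 minutes)
# 6-digit PIN: 1,000,000 combinations, worst case ~22.2 hours
```

Minutes to hours for the passcode versus more years than the universe has existed for the key: that gap is exactly what the delay and auto-erase features exist to close.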

In court filings, U.S. Attorney Eileen M. Decker argues that the government needs Apple’s help to answer critical questions, such as whom Syed Rizwan Farook and his wife Tashfeen Malik communicated with to plan and carry out the shootings, and where those people traveled before and after the incident. The FBI hasn’t attempted to guess the passcode itself for fear of triggering the auto-erase feature and losing access to whatever evidence the phone might contain.

What the FBI wants Apple to do — and why it doesn’t want to

The government argues that Apple has the ability to modify the phone’s software to turn off the auto-erase feature, disable the delays between passcode guesses, and allow passcodes to be entered electronically, all without changing the encryption of the iOS 9 operating system that powers the iPhone 5c in question. Magistrate Judge Sheri Pym ordered Apple to create such software and provide it to the FBI as a version that could be loaded onto the iPhone, since an iPhone will accept new firmware delivered over a USB cable if it carries a valid signature from Apple. But Apple doesn’t want to create this software, or the backdoor it would add to the iPhone.
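
In code, the search the order would enable is nothing exotic. Below is a minimal hypothetical sketch: try_passcode is an invented stand-in for whatever electronic-entry interface the modified firmware would expose, and the scenario assumes the delay and auto-erase protections are already switched off:

```python
# Minimal sketch of the brute-force search described above, assuming the
# delay and auto-erase protections are disabled. `try_passcode` is a
# hypothetical stand-in for the electronic-entry interface; it is not a
# real API.
from itertools import product

def brute_force(try_passcode, digits=4):
    """Try every numeric passcode of the given length, in order."""
    for combo in product("0123456789", repeat=digits):
        guess = "".join(combo)
        if try_passcode(guess):  # returns True when the phone unlocks
            return guess
    return None
```

With no delays between attempts, a loop like this is limited only by how fast the device can check each guess.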

While the order suggests that this alternate version of the iPhone’s software “would only load and execute on the subject device,” Cook writes that it’s impossible to guarantee that the backdoor created to assist in the San Bernardino investigation would be used only in this one case. “Once created, the technique could be used over and over again, on any number of devices. In the physical world, it would be the equivalent of a master key, capable of opening hundreds of millions of locks — from restaurants and banks to stores and homes.” He adds, “No reasonable person would find that acceptable.”

As Lee reports for Vox, no one is particularly worried about the privacy rights of a dead terrorism suspect. But as FBI officials argue that encryption inhibits law enforcement’s ability to investigate serious crimes, Apple, other tech companies, and civil liberties groups “see this case as an opening blow in a much larger government effort to undermine the security of their customers’ smartphone data.” Cook wrote in his letter that “The government is asking Apple to hack our own users and undermine decades of security advancements that protect our customers — including tens of millions of American citizens — from sophisticated hackers and cybercriminals.”

What’s next?

The FBI argues that there’s ample precedent for its request, since police agencies have long asked telephone companies, banks, and landlords to help them spy on criminal suspects. The problem is that the agency is using a particularly awful criminal case to set a new precedent with broad implications. Once Apple’s engineers have created the software, law enforcement requests for the same service could become routine, and the current order could open the door to even more problematic demands. And as Ashley Carman reports for The Verge, the case could force Congress to finally tackle the question of encryption, perhaps with the terrifying effect of passing an encryption bill that would require companies to build devices with backdoors built in, which would make those devices more vulnerable to hackers.

The iPhone in question is an iPhone 5c, and Apple has added more protections to the models introduced since, with more security features all but certain to follow. Newer iPhones (the 5s and later) include a Secure Enclave, which would hamper the kind of attack the FBI wants to perform: the Enclave keeps its own record of incorrect passcode guesses and responds more and more slowly to each successive attempt. There are no changes Apple could make in iOS alone to get around the Secure Enclave. In the future, Apple could build an iPhone that it has no way of unlocking.
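
To see how much those escalating delays matter, here’s an illustrative Python model. The schedule mirrors the retry delays Apple has documented for iOS (one minute after five failed tries, rising to an hour); the exact numbers are an assumption for illustration, not a specification of the Secure Enclave:

```python
# Illustrative model of escalating retry delays. The schedule is based on
# the delays Apple documents for iOS; treat exact values as assumptions.
DELAYS = {5: 60, 6: 5 * 60, 7: 15 * 60, 9: 60 * 60}  # attempt number -> seconds

def total_delay(attempts):
    """Cumulative enforced wait (in seconds) after `attempts` wrong guesses."""
    delay, total = 0, 0
    for n in range(1, attempts + 1):
        delay = DELAYS.get(n, delay)  # the delay steps up, then stays
        total += delay
    return total

# Exhausting all 10,000 four-digit codes against this schedule:
print(total_delay(10_000) / (60 * 60 * 24))  # ~416 days
```

Even without auto-erase, exhausting a four-digit passcode against hardware that enforces a schedule like this stretches past a year, which is why the FBI wants the delays removed in the first place.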

Apple and other companies face increasing demands from the government to build backdoor access into their devices, demands they’re fighting not only to defend the principles of security and privacy, but also to protect their reputations and reassure consumers that they and their devices are trustworthy. If you’re concerned about the security of your smartphone and its vulnerability to surveillance, this is a case worth following.
