Apple’s Principled Stand

on February 18, 2016

On Tuesday evening, a magistrate judge at a United States District Court in California issued an order compelling Apple to assist FBI agents in breaking into the phone used by one of the suspected shooters involved in the San Bernardino shootings in December 2015. Apple has formally objected to the order, explaining in a public letter to customers over Tim Cook’s signature why it feels this would be a dangerous step. Reactions to the situation have been somewhat predictable, with those on both sides adopting familiar positions. In reality, the situation is fairly nuanced, and that nuance is largely being missed.

Apple’s stance on encryption is clear

The current case is certainly not the first glimpse we’ve had into Apple’s stance on privacy, security, or even technical issues such as encryption. Since taking over as Apple CEO, Tim Cook has made privacy and security major elements of Apple’s positioning and differentiation, and he’s hammered these themes repeatedly, including in a previous letter to customers on privacy specifically. On encryption, Tim Cook has been one of the most vocal and strident opponents of the idea that governments should have backdoors to bypass encryption and gain access to devices. The reason for that stance, in turn, is clear: giving one entity a backdoor potentially gives any entity similar access, should the tools involved fall into the wrong hands. It also sets a precedent in which Apple might feel obligated to provide any government around the world the same tools it provides to one, or to begin to pick and choose which governments’ and jurisdictions’ requests it will honor, which is itself a slippery slope.

This case isn’t about encryption per se

What’s tricky is that this case isn’t about encryption per se. Tim Cook seeks to tie the FBI’s request to the broader issue of encryption by painting both with the “government backdoor” brush. In reality, this case is about brute forcing a passcode, not about encryption itself. As others have written, what’s being asked wouldn’t even be possible on newer devices, which include the Touch ID sensor, the associated Secure Enclave, and the encryption protections that go with them; this case concerns an iPhone 5C, which doesn’t have those elements. What ties encryption and this case together is that, in both, governments want Apple to create software that lets law enforcement circumvent security protections on its devices, hence the “backdoor” phrase. It’s arguably nitpicking to debate whether the backdoor is permanently left open or whether law enforcement needs Apple to unlock it every time it’s used.
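To see why the passcode protections at issue matter, a rough back-of-the-envelope sketch helps. The numbers below (attempt rate, delay schedule) are illustrative assumptions, not Apple’s actual firmware behavior: a 4-digit numeric passcode has only 10,000 combinations, so the escalating delays after failed attempts (and the optional erase-after-ten-failures setting) are what make brute forcing impractical, and removing them is precisely what the order asks Apple’s software to do.

```python
# Hypothetical numbers for illustration only, not Apple's actual
# implementation: the point is the orders-of-magnitude gap between
# brute forcing with and without failed-attempt delays.

KEYSPACE = 10_000  # 4-digit numeric passcode: 0000-9999


def time_without_protections(attempts_per_second=12.5):
    """Worst-case seconds to exhaust the keyspace at an assumed
    hardware-limited attempt rate, with no software delays."""
    return KEYSPACE / attempts_per_second


def time_with_escalating_delays():
    """Worst-case seconds under an assumed iOS-style delay schedule:
    first 5 attempts free, then 1 min, 5 min, 15 min, then 60 min
    before each further attempt (illustrative figures)."""
    delays = [0] * 5 + [60, 300, 900] + [3600] * (KEYSPACE - 8)
    return sum(delays)


fast = time_without_protections()
slow = time_with_escalating_delays()
print(f"without protections: ~{fast / 60:.0f} minutes")   # ~13 minutes
print(f"with delays:         ~{slow / 86400:.0f} days")   # over a year
```

Under these assumed parameters, stripping the delays turns a job of more than a year into one of minutes, which is why the software the FBI is requesting amounts to disabling the lock rather than picking it.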

An order for access to a specific device

The FBI has, however, asked specifically for Apple to assist it in accessing a single device, rather than to provide a blanket backdoor. Both the Bureau and the White House have suggested this negates Apple’s claims that the approach would be applicable to any device at any time. The order even provides for Apple to keep the device in question on its premises while it loads the software, offering remote access to the FBI’s investigators for the purpose of brute forcing the passcode. In a technical sense, this would appear to make it impossible for the FBI to take the software used to break into this one device and apply it to others, at least without a new warrant.

An issue of precedents

However, the biggest single problem with what Apple is being asked to do in this case is the precedent it sets, both from a strict legal perspective and otherwise. Legally, once Apple is compelled to provide the FBI with the means to access information on this one device, the precedent will allow it to be compelled to do so again. That applies not just to the technical specifics of this case, but to the legal structure under which Apple is being compelled to assist – i.e. creating new software (malware, effectively) which can bypass security protections built into a device. Although this order involves a one-off, after-the-fact solution, it also creates the risk that Apple might be compelled to design its standard software in such a way as to make this possible or easier on other devices going forward. Hence, this is the beginning of a slippery slope that could easily lead to just the kind of outcomes Apple is trying to avoid with encryption, even though this case is technically about something else.

Unappealing test case

One of the biggest challenges with this particular case is that its specifics make it very unappealing for any other tech company to jump to Apple’s defense. In a case where a reporter was protecting a whistleblower, for example, it might be far easier to garner public support for defending her right to privacy and security. I’ve seen quite a few people suggest that the FBI (which favors encryption backdoors) likely chose this case as a precedent setter precisely because it’s so hard to argue for the rights of the subject involved. Although big tech companies have made some supportive comments about encryption over the past year, including a joint letter to President Obama last June, none have yet forcefully come to Apple’s defense in this particular case. I suspect that reflects both a weaker commitment to the general cause and queasiness about engaging with this specific case.

A principled stand

The fact this case is so unappealing is precisely what makes Apple’s stand a principled one. A stand based solely on the optics of a particular case wouldn’t be worth much at all, but a stand on such a politically charged case shows just how serious Apple is about this issue. Cook makes clear in the letter that Apple shares the government’s aims in bringing terrorists to justice, so this is entirely about the means and not the ends. And Apple’s stance is not just about encryption, but about the inherent privacy and security of Apple’s devices. Apple’s argument is that ordinary people want devices that come with the kind of privacy and security guarantee Apple offers baked in, not because they have any nefarious intent, but simply because they want to protect their private and sensitive information. Tim Cook has argued that terrorists and criminals who want to keep their information out of the hands of law enforcement will always find ways to do so. That argument is backed up by a recent Harvard study on the easy availability of encrypted communication solutions.

The courts, and the court of public opinion

This whole issue turns on a court order, and ultimately judges will determine whether Apple has to comply with it as it currently stands. As such, no amount of lobbying or public statements by Apple or others is likely to sway the outcome, which will depend on individual judges’ interpretations of the facts of the case combined with the applicable laws (though I’ve no doubt Apple appreciates the support of the EFF and others who have promised to file amicus briefs). Arguably, therefore, it matters little whether other tech companies jump in on Apple’s side, because they likely can’t affect the outcome.

But this case will almost certainly bring to the forefront a debate about the broader issues involved, which is what Apple has wanted all along. Once that happens, I would hope other tech companies will indeed weigh in on the issue, and do so far more vigorously than they have so far. The biggest challenge is that this debate will take place in a public sphere in which discussions of complex matters are almost always over-simplified. Already, we have presidential candidates and congressmen weighing in on both sides, pandering to their bases without any real understanding of the intricacies or the broader implications. Although Apple has wanted a legislative solution all along, it now risks being dragged into a very public battle in which Exhibit A will be this court case about a terrorist’s iPhone, and that may make the battle much tougher to win.