FBI v. Apple: “Unduly Burdensome”

on April 3, 2016
Reading Time: 17 minutes

You might well be asking yourself, if the FBI withdrew its challenge against Apple, then why are we still talking about FBI v. Apple? Well, the San Bernardino case is over, but there are many, many more cases still pending. The ACLU published an interactive map of locations where the FBI is currently using the All Writs Act to demand assistance from tech companies. You can view it HERE. So yeah, this matter is far from over.

Author’s Note: For an outline of how I think FBI v. Apple will play out, please see HERE.

As I’ve written before, there are two big issues that the FBI must overcome before it can get a court to order Apple to undo its own encryption. The first issue is CALEA — a statute that appears, on its face, to prohibit the FBI from asking for the very thing they’re currently asking for. I discussed CALEA HERE.

Today I focus on the second big stumbling block facing the FBI. If the FBI is going to use the All Writs Act to order Apple to assist them in breaking their own encryption, the FBI must first demonstrate that the requested assistance is not “unduly burdensome”.

One Phone, One Time

The FBI has steadfastly insisted that this case is just about one phone, and that the action they are requesting of Apple would occur only once. That is a crock of manure.

So important is this issue that I was going to devote an entire article to it. However, since the current case is over, let me just point out that the director of the FBI — during testimony — under oath — before Congress — said:

“(O)f course” the FBI would use the ruling from this case to “return to the courts in future cases to demand that Apple and other private companies assist . . . in unlocking secure devices.”

So much for ‘one phone, one time’.

But is it really all that important? Is the claim that this is just about ‘one phone, one time’ really that big of a deal?

Oh yeah.

In their pleadings, the FBI said that Apple “desperately needs” this case to be about more than just one phone.

Apple desperately wants—desperately needs—this case not to be “about one isolated iPhone.” ~ FBI Pleadings

Someone was desperate, all right, but it wasn’t Apple. The FBI knows that if they are forced to acknowledge that Apple is going to have to comply with similar requests over and over again, then they will also be forced to acknowledge that the burdens that Apple could be expected to endure will expand exponentially.

It was the FBI who desperately wanted — desperately needed — this case to be about one isolated iPhone. That is why they continue to defend their position even when the facts, logic, common sense and the testimony of their own director make their position indefensible. Zealous advocacy is to be commended. Purposefully misstating a material fact is to be condemned.

In just one week, the FBI’s gone from “just one phone” to sharing with the entire law enforcement community. Remember this for the next one. ~ Jonathan Zdziarski (@JZdziarski)

Precedent

This case was never about getting information from the phone. It was always about setting a legal precedent that would allow the FBI to force tech companies to build back doors to the FBI’s specifications.

“Ah,” you say, “you can’t prove that. You’re starting to sound like a conspiracy nut.”

Oh yeah? You know who else sounds like a conspiracy nut? Richard Clarke, former national security advisor and head of counterterrorism.

“[The FBI] is not as interested in solving the problem as they are in getting a legal precedent,” Clarke said. “Every expert I know believes the NSA could crack this phone. They want the precedent that government could compel a device manufacturer to let the government in.

The FBI director is exaggerating the need for this, trying to build it up as an emotional case … It’s Jim Comey. And the Attorney General is letting him get away with it. ~ Richard Clarke

All the evidence is consistent with the “crackpot conspiracy theory” that the FBI has been systematically trying to compel firms to backdoor their own encryption. At this point, I would venture to say that you have to be a crackpot NOT to believe these theories.

Unduly Burdensome

Duty To Assist Law Enforcement

Even now, I don’t think people realize what this case is all about. Apple did nothing wrong here. They were just being asked to help law enforcement out. Our legal system allows that, but only to a very limited degree.

[pullquote]The government does not hold the general power to enlist private third parties as investigative agents[/pullquote]

While the government can, in some circumstances, require third parties to support law enforcement investigations — for example, by requiring them to produce relevant evidence or give truthful testimony — the government does not hold the general power to enlist private third parties as investigative agents.

Some typical examples of what citizens can be asked to do:

— Produce existing business records;
— Freeze assets and accounts;
— Turn over security footage.

Let’s examine that last example. The FBI can go to a store and ask them for their security footage, but that’s about it. They can’t ask the store owner to stay up all night filming a suspect, and they can’t ask the store owner to install additional cameras in his lunch room, bathroom and boardroom. All of that is way, way, way beyond the call of duty.

What the FBI is asking of Apple is way beyond the call of duty too.

Legal Buzzwords

The Courts have placed severe limitations on what law enforcement can and cannot ask third parties to do. Here are the kinds of buzzwords that are seen when reading through the applicable case law:

Must not be “in any way burdensome”; “meager assistance”; “minimal effort”; “no costs will be incurred”; “require minimal uses of company resources”; “no disruption to its operations”; the absence of any conceivable “adverse effect”; “normal course of their business”; “must not adversely affect the basic interests of the third party or impose an undue burden”.

Perhaps now you begin to see how little law enforcement can demand of us, and how free we are to refuse those demands.

Public Utilities vs. Private Entities

Most of the cases cited by the FBI in support of their position concerned highly regulated public utilities, not private companies. And the Courts have gone out of their way to note that much more can be demanded of public utilities — such as phone companies — than of private parties.

Even in those cases where public utilities were required to assist law enforcement, the types of burdens the Courts imposed were not at all as onerous as the one being requested of Apple.

For example, the FBI claimed that the Courts compelled the Mountain Bell telephone company to do programming, so it would certainly be nothing new for the Courts to compel Apple to do the same.

But in 1979, when the Mountain Bell case occurred, “programming” consisted of a technician — a single technician, mind you — using a “teletypewriter,” and the entire process “t[ook] less than one minute.” ~ Apple Pleadings

Here is an image of the type of device that law enforcement asked Mountain Bell to “program” in 1979:

[Image: the 1979-era telephone company switching equipment at issue in the Mountain Bell case]

So yeah, not the same as asking a company to create a tailored operating system.

Proprietary

In the current case, we’re talking about messing around with someone’s proprietary intellectual property. To my knowledge that has never occurred under the All Writs Act before. Ever.

Let me repeat that: The Courts have never required a third party to alter — much less degrade — their proprietary property in order to aid law enforcement.

Screwing around with someone’s proprietary property is not like asking them for security footage. It’s more akin to asking an author to rewrite portions of their book and then put that book up for sale under the author’s name. Similarly, what the FBI wants Apple to do is to rewrite portions of their security software and then put it up for sale under Apple’s imprimatur.

Unprecedented

In fact, the requested action by the FBI is unprecedented at every level. Never before has law enforcement asked that such a burden be imposed under the All Writs Act.

For A Living

Proprietary? Unprecedented? The FBI shrugs these off, blithely responding, ‘Fiddle-dee-dee, writing a little software is not a burden for Apple. After all, they write software for a living.’

(W)hile this case requires Apple to provide modified software, modifying an operating system—writing software code—is not an unreasonable burden for a company that writes software code as part of its regular business. ~ FBI Pleadings

Oh yeah? As Apple pointed out in their pleadings, following the government’s thinking to its logical conclusion leads to absurd results.

(I)t would not be unreasonably burdensome to demand that Boeing build a custom jet for the government because Boeing builds planes as part of its regular business or to demand that a pharmaceutical company make drugs for executions after it has made the intentional decision not to. ~ Apple Pleadings

Just because Apple is in the business of building encryption software does not mean Apple is in the business of tearing down their encryption software any more than Boeing is in the business of building planes that are specially designed to crash.

Costs

The compromised operating system that the government demands would require significant resources and effort to develop. Although it is difficult to estimate, because it has never been done before, the design, creation, validation, and deployment of the software likely would necessitate six to ten Apple engineers and employees dedicating a very substantial portion of their time for a minimum of two weeks, and likely as many as four weeks. Members of the team would include engineers from Apple’s core operating system group, a quality assurance engineer, a project manager, and either a document writer or a tool writer. ~ Apple Pleadings

The costs for Apple to accede to the government’s request are extraordinary and unprecedented…

…but no one cares. Apple makes billions of dollars, so no one has any sympathy for them.

There are, of course, other, more long term, and more damaging, costs such as forcing Apple to violate their existing representations, and the harm that would be caused to Apple’s reputation, their global brand and their bottom line.

Again, hardly anyone outside of Apple cares about those costs either. “Whoop de doo,” they say. “Price of doing business.”

Except, of course, that’s dead wrong. Hurting your brand, your reputation, and your product is NOT the price of doing business. On the contrary, it’s the penalty paid for doing your business very, very badly. And law enforcement simply does not — at least not without statutory authority — have the power to command you to run your business badly.

It’s important to understand that the costs described above are just the beginning, not the end, of the burdens that the FBI’s request would place upon Apple. The truly oppressive costs would come in forms that few have adequately considered.

Forensic

Standard forensic practice would require the code of any forensic tool Apple produces to be preserved for at least as long as it might be needed as evidence in court. We’re talking about years, and, with appeals, perhaps decades. So just forget about the idea of creating, then destroying, a software skeleton key as quickly as you made it. That’s out.

And, of course, the Apple engineers would have to testify at trial about the back door they had created; otherwise, the evidence it produced might well be ruled inadmissible.

Essentially, the encryption-breaking portion of Apple would become a permanent arm of the government’s forensic team.

This, of course, is materially different from merely asking a store owner to provide law enforcement with a copy of their security footage.

Pariahs

Apple would have to maintain an in-house team of engineers dedicated to hacking its own users and affirmatively undermining the company’s promised security measures. The engineers involved in this effort might be some of the very same engineers responsible for designing and building the security features in the first place. Can you imagine what an awkward position that would place them in? Everyone else at Apple would view them as saboteurs. They would be treated like pariahs, and their jobs would make their lives a living hell.

A House Divided

It is extremely difficult to write bug-free code.

There are two ways to write error-free programs; only the third one works. ~ Alan Perlis

The suggestion that Apple would be able to program a back door without risking a major screwup is laughable. I mean, have you even MET programming?

QUESTION: Joe’s code has 20 bugs. If Joe fixes 2 bugs per hour for 8 hours, how many bugs does Joe’s code have now?

ANSWER: 27.

[pullquote]Improving security would be costly and dangerous[/pullquote]

Software bugs can interact with existing code in complex ways, creating unanticipated new paths for bypassing iPhone security and exploiting the phone. Purposefully introducing one vulnerability is likely to create others, and those can be genuinely dangerous. That means every design choice Apple makes to improve device security entails not only the foreseeable front-end costs of implementing it but also the unpredictable back-end costs of degrading that improved security. And that’s especially true in this situation, where Apple would have to create the code entirely by itself, and without the possibility of any outside security audit.

[pullquote]The smart thing would be to stop improving your encryption[/pullquote]

Have you considered the contradictory incentives that would create? What is the point of making your encryption better if you know that you are simultaneously required to break that encryption? You’re just making your life — and the lives of your co-workers — harder. Instead of simply asking whether new security measures are cost-effective to implement from a user’s perspective, engineers would need to evaluate whether they could justify the additional cost of being required to attack those measures too.

The only way to avoid unnecessary costs, unnecessary work, unnecessary danger, and unnecessary conflicts with co-workers would be to stop improving the encryption.

Morale

The effect on morale, for both the engineers and the company overall, would be devastating.

What Apple engineer is going to want to destroy the company’s encryption and make things worse for their customers? Is that even ethical? There’s already been talk of Apple engineers refusing to comply with such an order or resigning their positions.

And how is Apple supposed to keep up overall morale when employees all know that one part of the company is actively sabotaging the other, and all in order to make their product worse and to make their customers less safe?

Safeguarding

As if it’s not enough that the government is forcing Apple to create GovtOS, they’re also making Apple responsible for safeguarding it.

Apple is being forced to make a nuclear weapon, then either take responsibility for guarding that weapon or destroy it and rebuild it later. ~ Jonathan Zdziarski on Twitter

The FBI frames this burden as a favor since they’re “allowing” Apple to decide for itself whether they wish to share the requested software skeleton key or keep it in the safety of their own secure headquarters.

The Court’s Order is modest. It applies to a single iPhone, and it allows Apple to decide the least burdensome means of complying. ~ FBI Pleadings

Oh, thanks a bunch, FBI.

When the FBI says that it is “allowing” Apple to decide the least burdensome means of complying with their request, what they’re really saying is that they’re foisting the responsibility of solving this impossible task onto Apple. To paraphrase Pyrrhus (he of the Pyrrhic victory), if the FBI does Apple another such favor, Apple is ruined.

Apple v FBI debate remind anyone of Jurassic Park? “We want you to create mutant dinosaurs, but only for safe captivity on this one island.” ~ Jon Fortt on Twitter

When asked, during a Congressional hearing, whether it would be difficult to safeguard GovtOS, FBI Director Comey testified that he had “a lot of faith” that Apple could protect the code from falling into the wrong hands. How oh so very convenient for Comey and the FBI, and how oh so very inconvenient for Apple — who has to do all the work and endure all the risk as well.

Once you’ve created code that’s potentially compromising, it’s like a bacteriological weapon. You’re always afraid of it getting out of the lab. ~ Michael Chertoff, co-author of the Patriot Act, US Secretary of Homeland Security under George W. Bush

The government says, ‘Hey, security is no big deal’.

(T)here is no reason to think that the code Apple writes in compliance with the Order will ever leave Apple’s possession. ~ FBI Pleadings

There is, in fact, EVERY reason to think that the requested code will leave Apple’s possession.

Apple would end up being responsible for the Hope Diamond of security keys. ((Now that I think about it, the value of the Hope Diamond pales in comparison to the value of breaching Apple’s encryption.))

The code would be a major prize and actors would go to almost any length — including kidnapping — to obtain it.

It makes Apple employees targets of foreign governments, kidnappings, hacking, surveillance, blackmail, etc. ~ Jonathan Zdziarski on Twitter

And who would these attackers be? The baddest of the bad. Hackers, cybercriminals, authoritarian governments such as China and Russia. Some of the best minds — with some of the worst intentions — would bend their efforts toward obtaining this newly created skeleton key.

[I]t may simply be impossible to keep the program from falling into the wrong hands. ~ NSA expert Will Ackerly

And since Apple would have to maintain each key, and since Apple would have to create and re-create the key thousands upon thousands of times a year, Apple’s burden would be constant and never-ending.

We strongly believe the only way to guarantee that such a powerful tool isn’t abused and doesn’t fall into the wrong hands is to never create it. ~ Apple Pleadings

A Skeleton Key

The government is also very mistaken in their claim that the crippled iOS it wants Apple to build can only be used on one iPhone.

The technical experts have warned us that it is impossible to intentionally introduce flaws into secure products—often called backdoors—that only law enforcement can exploit to the exclusion of terrorists and cyber criminals. ~ Congressman John Conyers

Once GovtOS is created, personalizing it to a new device becomes a simple process. If Apple were forced to create GovtOS for installation on the device at issue in this case, it would likely take only minutes for Apple, or a malicious actor with sufficient access, to perform the necessary engineering work to install it on another device of the same model. ~ Apple Pleadings

A signed firmware update that is not truly limited to a single device, even one created for legitimate forensic purposes, becomes like a ‘skeleton key’ for the entire class of devices. Moreover, the more often this tool is used, the greater the risk it will be stolen or otherwise disclosed. ~ Apple Pleadings

Apple wouldn’t be creating a single key to open a single lock. They would be creating a skeleton key that would be capable of opening a billion locks.
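The skeleton-key point can be made concrete with a toy model. The sketch below is a deliberate simplification (real iOS firmware is signed with asymmetric cryptography, and every name here — the key, the device IDs, the `GovtOS` payload string — is invented for illustration); it shows why a signature that covers only the code validates on every device of the class, while a signature bound to one device’s unique ID must be re-created, by the holder of the signing key, for each new phone.

```python
import hashlib
import hmac

# Invented stand-in for the manufacturer's secret signing key.
SIGNING_KEY = b"hypothetical-firmware-signing-key"

def sign(payload: bytes) -> bytes:
    """Simplified firmware 'signature' (an HMAC standing in for RSA/ECDSA)."""
    return hmac.new(SIGNING_KEY, payload, hashlib.sha256).digest()

def device_accepts(firmware: bytes, signature: bytes, device_id: bytes,
                   personalized: bool) -> bool:
    """A device boots firmware only if the signature verifies.

    If `personalized`, the signature must cover the code PLUS this
    device's unique ID; otherwise it covers the code alone.
    """
    expected = firmware + device_id if personalized else firmware
    return hmac.compare_digest(sign(expected), signature)

firmware = b"GovtOS: passcode retry limits disabled"

# Case 1: signature covers only the code. One signature, every device
# of the class accepts it -- a skeleton key.
sig = sign(firmware)
assert device_accepts(firmware, sig, b"device-A", personalized=False)
assert device_accepts(firmware, sig, b"device-B", personalized=False)

# Case 2: signature bound to one device's unique ID. It opens that
# device and no other; each new phone requires a fresh signing step.
sig_a = sign(firmware + b"device-A")
assert device_accepts(firmware, sig_a, b"device-A", personalized=True)
assert not device_accepts(firmware, sig_a, b"device-B", personalized=True)
```

Note what Case 2 implies: personalization only helps if the signing step stays locked inside Apple, which is exactly why Apple argued it would be on the hook to re-sign the tool, per device, for every future request.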

Just in case you didn’t get the irony, FBI now has a backdoor that isn’t restricted to a single device, like they insisted Apple could make. ~ Jonathan Zdziarski (@JZdziarski)

A Footprint

As if it’s not bad enough that the requested skeleton key is likely to be stolen, security experts say that the mere act of creating the software would put at risk the privacy and integrity of the data stored on millions of iPhones worldwide.

[u]sing the software even once could give authorities or outsiders new clues to how Apple’s security features work, potentially exposing vulnerabilities that could be exploited in the future. ~ Brandon Bailey

[T]here is no way to make a backdoor that works only for this single phone—the process of creating the backdoor establishes a blueprint and workflow for compromising all iPhones. ~ Kalev Leetaru, Forbes Contributor

The danger is not whether the FBI submits one request or a thousand, it’s forcing Apple to create the tool. ~ Bruce Schneier, security technologist at Harvard University’s Berkman Center for Internet and Society

We’re not just talking iPhones here. Security experts fear that if Apple is forced to create a “key” to access one of the San Bernardino terrorists’ iPhones, then that technology will leave a “footprint” that cannot be erased. And that ‘footprint’ could provide a hacker with a path for attacking not just Apple, but the encryption of others as well.

My definition of an expert in any field is a person who knows enough about what’s really going on to be scared. ~ P. J. Plauger

We’ve come a long, long way from law enforcement merely asking a store owner for some security footage, right?

Does Not Exist

Another aspect of this case that hasn’t been receiving enough attention is the fact that the FBI is asking Apple to create something that does not exist. In all the government’s past requests for citizen assistance, never — NEVER — has the government asked someone to create something that didn’t already exist.

[pullquote]How exactly do you compel creativity and ingenuity?[/pullquote]

And we’re not talking about building a new chair or making a new dress either. We’re talking about writing code — something that’s not easy to create. Setting aside the First Amendment issues — which deserve an article of their own — how exactly do you compel creativity and ingenuity?

Apple is being asked to take its very best engineers — you know, the ones who were supposed to be making its encryption harder to break — and turn their creative juices toward breaking and degrading that encryption.

FBI is not only ordering Apple to perform surgery, they’re ordering them to invent a new medical procedure, and with no medical training. ~ Jonathan Zdziarski on Twitter

Actually, it’s even more sinister than the above analogy implies. The FBI is ordering Apple to invent a new medical procedure that would undo prior surgical repairs and do their patient harm.

Apple Has A Compelling Interest Not To Comply

JUST MARKETING

“Ah, so what,” says the government, “This is all Apple’s fault anyway. They brought this on themselves.”

This burden, which is not unreasonable, is the direct result of Apple’s deliberate marketing decision to engineer its products so that the government cannot search them, even with a warrant. ~ FBI Pleadings

Let’s just set aside for the moment that what Apple is doing is one hundred percent legal (and, according to CALEA, what the government is attempting to do is one hundred percent illegal). Apple’s refusal to dilute their encryption is a “marketing decision”…

…in roughly the same sense that not serving burgers garnished with sewage is a “marketing decision”. ~ Julian Sanchez on Twitter

KEY EMPLOYEES

Have you given any thought to which engineers Apple would have to use in order to break their own encryption? You should.

Apple has maybe 5 employees capable of writing the software. Doing this means not fixing some other vital bug. ~ Rob Graham on Twitter

The FBI’s request would not just turn Apple’s best minds toward the task of breaking Apple’s encryption, it would also divert those self-same minds away from the all-important task of making Apple’s encryption better.

DISCOURAGE CUSTOMER UPDATES

If Apple can be forced to use its automatic updates to remove security features, it creates an incentive for customers not to update their devices. It’s in Apple’s best interests that customers update their operating systems as soon as possible; customers benefit not just from the features provided in updates but from security enhancements as well. The disincentive created by the FBI’s intrusion into Apple’s software update procedure would leave the operating system open to even more security vulnerabilities.

SAFEGUARD CLIENT DATA

Apple is not just petulantly refusing to honor the FBI’s request out of childish spite or due to a lack of patriotic fervor. Apple has a compelling interest in safeguarding the data protection systems that ensure the security of hundreds of millions of customers who depend upon, and store their most confidential data on, their iPhones. An order compelling Apple to create software that defeats those safeguards undeniably threatens those systems and adversely affects both Apple’s interests and the interests of iPhone users around the globe. The protections that the government asks Apple to compromise are the most security-critical software component of the iPhone—any vulnerability or back door, whether introduced intentionally or unintentionally, can represent a risk to all users of Apple devices simultaneously.

Apple is being asked to build a cruise ship that will flood just one customer’s compartment, without making the ship any less seaworthy.

In essence, the FBI is demanding that Apple re-write its own software code, degrade the security of their customers, and create potentially catastrophic risks to the security of users’ Apple devices. How is that not going to be construed as unduly burdensome?

NORMAL COURSE OF BUSINESS

In every other case where the Courts have compelled a company to assist law enforcement, they justified it by pointing out that the request did not require the company to do anything other than what it was already doing in its normal course of business anyway. For example, law enforcement can request security footage from a store because the store, in the normal course of business, installed a security camera and took, and kept, security footage. The burden of providing a copy to law enforcement is minimal.

That is not the case here. Not only is the FBI asking Apple to do something they would not do in the normal course of business, they’re asking Apple to undo what they normally do.

ANTITHETICAL

Apple is being asked to take an action that is not only costly, not only NOT in the normal course of business, not only dangerous, but something that is antithetical to their business and something that is plainly “offensive to it.” N.Y. Tel. Co., 434 U.S. at 174.

Apple is not required to sabotage its own products. On the contrary:

(Apple is) free to choose to promote its customers’ interest in privacy over the competing interest of law enforcement. ~ Magistrate Judge Orenstein

Buzzwords Redux

Let’s look at that list of legal buzzwords again:

Must not be “in any way burdensome”; “meager assistance”; “minimal effort”; “no costs will be incurred”; “require minimal uses of company resources”; “no disruption to its operations”; the absence of any conceivable “adverse effect”; “normal course of their business”; “must not adversely affect the basic interests of the third party or impose an undue burden”.

After re-reading the above, do you really think there is any reasonable way to construe the government’s request of Apple as anything but burdensome?

Third Party

One final time, I feel I need to re-remind everyone that Apple is not the bad guy here. Apple did nothing wrong. They didn’t break any laws. This is about law enforcement asking a third party — who is not engaged in any wrongful conduct — to not just take an action, but to take an action they don’t want to take and one that would be harmful to them.

The government is asking Apple to do them a favor, but what a favor. The FBI asking Apple to trash its own encryption is like your neighbor asking you to burn down your house so he can stay warm.

[pullquote]Consider the effect on small businesses[/pullquote]

And have you considered the effect the FBI’s request would have on companies not named Apple?

The government is desperately trying to maintain the ludicrous fiction that this is about one phone, one time, because it doesn’t want the Court to think about the very real-world consequences of what would happen when their requests were made not just to Apple, but to all companies, everywhere, all the time. Unlike Apple, few companies have the resources necessary to comply with such requests, and even fewer have the resources necessary to resist such requests. ((Two examples are Hushmail (a Canadian company) and Lavabit. Both were badly damaged by government demands for their clients’ data that they could not afford to fight; Lavabit shut down entirely.))

The government’s requests would chill innovation and deter companies from entering the important field of encryption.

Conclusion

An FBI that asks Apple to break their encryption for the greater good is like a cannibal that asks a chef to teach him how to cook so he may better serve mankind. Sounds noble, but it’s just going to get us all in hot water.

The FBI is attempting to compel Apple to reengineer a product design solely to defeat the product’s purpose. Asking Apple to create that which does not already exist and which Apple does not want to create and which will harm the company now, and going forward, is the very definition of burdensome.

What the government is demanding of Apple is simply above and beyond what can be, and should be, demanded of a good citizen.