The court battle between Apple and the Federal Bureau of Investigation has reignited the debate over privacy and surveillance, going far beyond a one-time request by law enforcement to unlock the iPhone used by the perpetrator of the Dec. 2 mass shooting in San Bernardino, Calif., according to security and legal experts.
On Feb. 25, Apple asked the U.S. District Court for the Central District of California to vacate a previous order compelling the company to create a special operating system that would remove key security restrictions from a phone taken from one of the deceased San Bernardino shooters.
At the heart of the court filing are the company’s arguments that the FBI is essentially conscripting a private entity to work for the U.S. government. Apple contends that what FBI Director James Comey described as a “legal issue [that] is actually quite narrow” will become a flood of decryption requests that force Apple to create a de facto backdoor.
Noting that district attorneys in other states have stated that they have dozens of phones that they would like to decrypt, Apple’s lawyers argued that any court precedent will quickly be used to open the door to forcing the company to help local, state and federal law enforcement agencies spy on its customers.
“It will only be a matter of days before some other prosecutor, in some other important case, before some other judge, seeks a similar order using this case as precedent,” the company stated in its brief. “Once the floodgates open, they cannot be closed, and the device security that Apple has worked so tirelessly to achieve will be unwound without so much as a congressional vote.”
Apple’s argument is the latest milestone in a case that is quickly becoming a rallying point for pro-privacy advocates as well as homeland defense advocates who support the government’s right to access any individual’s data under court order. Security and legal experts dismissed the Department of Justice’s arguments that the case is a narrow exception.
“The fear is that it sets a precedent,” Bruce Schneier, a well-known cryptography expert and chief technology officer for security firm Resilient Systems, told eWEEK earlier this week. “The fear is the next time it will happen with something that is less clear and less sympathetic. And the fear is that Apple has to do this several times a day.”
From the “Crypto War” of the 1990s to the massive intelligence leaks of former National Security Agency analyst Edward Snowden, history offers some key lessons, said security and legal experts.
1. Cost Will Turn ‘One-Time’ Access Into a Permanent Backdoor
FBI Director Comey and other law-enforcement officials have argued that their request is not a backdoor—a secret way to decrypt data. Comey argued in a column on Lawfare that what the agency is asking for is a limited case of turning off a security feature—the iPhone’s automatic erasure of its data after 10 incorrect passcode guesses—not a backdoor.
“The relief we seek is limited and its value increasingly obsolete because the technology continues to evolve,” he wrote. “We simply want the chance, with a search warrant, to try to guess the terrorist’s passcode without the phone essentially self-destructing and without it taking a decade to guess correctly. That’s it.”
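Some rough arithmetic shows why that auto-erase feature, and the escalating delays between guesses, are the passcode's real defense. The sketch below is a back-of-the-envelope illustration, not Apple's actual figures: it assumes, hypothetically, that each guess costs about 80 milliseconds of on-device key derivation, a number chosen here only for illustration.

```python
# Hypothetical brute-force arithmetic for a numeric iPhone passcode,
# assuming the 10-try erase and escalating delays have been disabled.
GUESS_TIME_S = 0.08  # assumed per-attempt key-derivation cost (illustrative)

def worst_case_hours(digits: int) -> float:
    """Hours needed to try every numeric passcode of the given length."""
    return (10 ** digits) * GUESS_TIME_S / 3600

print(f"4-digit passcode: {worst_case_hours(4):.2f} hours")  # 10,000 codes
print(f"6-digit passcode: {worst_case_hours(6):.1f} hours")  # 1,000,000 codes
```

Under those assumptions, a four-digit passcode falls in well under an hour once the retry limits are gone, which is why security experts argue that removing them is tantamount to removing the lock.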
Yet, the argument is disingenuous at best, say security experts. The request will not be unique. As soon as Apple shows that it can allow access to the data on an iPhone, the FBI will start making such requests on a regular basis. And, because Apple is a business, when faced with repeated requests for access to devices, the company will automate the process as much as possible to reduce costs. The technological resistance to requests—which, right now, is quite high—will be lower.
“If the [FBI] succeeds, Apple and other companies that are subject to these sorts of orders will begin to get hundreds or thousands of them a year,” Alex Abdo, a staff attorney for the American Civil Liberties Union, said in a press call on Feb. 24. “Companies will have to create a compliance division … that division within these companies responsible for delivering malicious updates to their users is going to become the technological backdoor.”
Apple has been making this argument since it first publicized the court order on Feb. 16.
“Up to this point, we have done everything that is both within our power and within the law to help [the FBI],” CEO Tim Cook said in his Feb. 16 letter to customers. “But now the U.S. government has asked us for something we simply do not have, and something we consider too dangerous to create. They have asked us to build a backdoor to the iPhone.”
2. Historically, the FBI Never Had These Powers
A central argument of Apple’s Feb. 25 motion to vacate the court order is that, by invoking the All Writs Act, a law dating to 1789, the U.S. Department of Justice is attempting to unconstitutionally force companies to assist it.
“This case is about the Department of Justice and the FBI seeking through the courts a dangerous power that Congress and the American people have withheld: the ability to force companies like Apple to undermine the basic security and privacy interests of hundreds of millions of individuals around the globe,” the company argued.
Resilient Systems’ Schneier agreed that these sorts of surveillance powers—and the ability to recover information about people’s past activities—have not existed until now.
“Historically, all the conversations disappeared forever and there was no access,” Schneier said. “The conversations you had would never be written down to be later read.”
The data privacy problems are only going to get worse. In a recent paper published through Harvard University, Schneier and other security and legal experts warned that the burgeoning Internet of things will mean that every aspect of our lives will be recorded.
“The still images, video, and audio captured by these devices may enable real-time intercept and recording with after-the-fact access,” the group wrote. “Thus, an inability to monitor an encrypted channel could be mitigated by the ability to monitor from afar a person through a different channel.”
Metadata, which remains unencrypted, provides a great deal of information already—whom the suspect contacted through email, chat or phone calls and the rough location of the person during the day. “This information provides an enormous amount of surveillance data that was unavailable before these systems became widespread,” the group stated.
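The group's point about metadata can be made concrete with a small sketch. The call records below are invented for illustration and are not drawn from the case; they show how contacts and rough movement fall out of metadata alone, with no message content at all.

```python
# Hypothetical, simplified call-detail records: metadata only, no content.
records = [
    {"caller": "A", "callee": "B", "time": "09:05", "tower": "T1"},
    {"caller": "A", "callee": "C", "time": "12:40", "tower": "T7"},
    {"caller": "A", "callee": "B", "time": "18:15", "tower": "T1"},
]

# Whom A talked to, and where A roughly was, emerges from metadata alone.
contacts = sorted({r["callee"] for r in records})
towers = [r["tower"] for r in records]  # rough movement over the day

print(contacts)  # ['B', 'C']
print(towers)    # ['T1', 'T7', 'T1']
```

Scaled up to months of records across email, chat and calls, this is the "enormous amount of surveillance data" the Harvard group describes, available with no decryption required.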
3. Products With Backdoors Do Poorly in the Market
In the 1990s, when the U.S. government forced browser makers and other technology companies to limit the encryption strength in their products, U.S. technology companies were hurt. Firms in Europe and Israel quickly filled the vacuum with software that could use strong encryption.
Both Apple’s legal brief and security experts have asked to what extent the FBI has the legal power to order a company to research and develop exploits for its own products.
“Could the government compel a Google security engineer to spend Tuesdays at Ft. Meade [NSA headquarters] to break security features that they coded on Monday?” the ACLU’s Abdo asked. “These are the legal questions that the government’s position raises and that the government has yet to grapple with.”
Even the FBI acknowledged the new world we are entering and the novelty of its investigative techniques.
The case “does highlight that we have awesome new technology that creates a serious tension between two values we all treasure: privacy and safety,” Comey said.