Apple Inc. is already on record as opposing any government mandates that iOS devices be equipped with “back doors” through the company's communications and storage encryption algorithms.

But on Tuesday, in response to an order from Magistrate Judge Sheri Pym of the U.S. District Court for the Central District of California compelling Apple to help the FBI break through a certain iPhone’s storage encryption by brute force, the company issued an open response to customers, claiming that compliance would jeopardize the security of all iOS device owners.

“Specifically, the FBI wants us to make a new version of the iPhone operating system, circumventing several important security features, and install it on an iPhone recovered during the investigation,” the open letter reads, attributed to CEO Tim Cook. “In the wrong hands, this software — which does not exist today — would have the potential to unlock any iPhone in someone’s physical possession.”

Apple’s claim may be a bit of a stretch. The order, issued Tuesday, compels Apple to provide reasonable assistance to the FBI in defeating any auto-erase function that may be present, so that criminal investigators can attempt to break into the phone by submitting passcode guesses.

Third-party software could conceivably wipe the phone, for instance, after some set number of failed passcode attempts; iOS itself offers a setting that erases the device after ten consecutive failures.
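For illustration, consider a minimal sketch of how such an auto-erase gate might work. Everything below (the class name, the attempt limit, the wipe logic) is an assumption for the example, not Apple’s actual implementation:

```python
# Minimal sketch of an auto-erase gate around passcode attempts.
# Hypothetical illustration only; this is not Apple's implementation.

MAX_FAILED_ATTEMPTS = 10  # assumed limit; iOS's optional setting erases after ten


class PasscodeGate:
    def __init__(self, correct_passcode: str):
        self._correct = correct_passcode
        self._failures = 0
        self.wiped = False

    def try_passcode(self, guess: str) -> bool:
        """Return True on a correct guess; wipe after too many failures."""
        if self.wiped:
            raise RuntimeError("device wiped; data is unrecoverable")
        if guess == self._correct:
            self._failures = 0
            return True
        self._failures += 1
        if self._failures >= MAX_FAILED_ATTEMPTS:
            self.wiped = True  # on a real device, the encryption keys are destroyed
        return False
```

Remove that counter, along with any escalating delay between attempts, and investigators can submit guesses without limit. That is the substance of the assistance the order demands.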

The order represents the hope that Apple can furnish the FBI with a piece of software explicitly crafted to clear the path for investigators to breach the passcode lock on one iPhone. That phone is said to have belonged to one of the perpetrators of the December 2 shooting in San Bernardino, which killed 14 people and injured 22.

According to Apple’s published legal guidelines [PDF], the company remains both capable of and willing to extract certain data from passcode-locked iOS devices running versions 4 through 7 of the operating system. Version 4 was released in June 2010.

“...Upon receipt of a valid search warrant issued upon a showing of probable cause,” the guidelines read, “Apple can extract certain categories of active data from passcode locked iOS devices. Specifically, the user generated active files on an iOS device that are contained in Apple’s native apps and for which the data is not encrypted using the passcode (‘user generated active files’), can be extracted and provided to law enforcement on external media.”

In such situations, Apple insisted on undertaking the process only at its Cupertino, Calif., headquarters.

The legal guidelines also specify the language that law enforcement officials should use in any warrants they present: language stipulating that Apple may provide law enforcement officials with copies of the data extracted from the device in question, in its encrypted form.

This is extremely important, because it suggests that Apple, at the very least, does not contest law enforcement’s right to try to decrypt those files by whatever means it has at its disposal.

What Key, Where?

According to Apple’s guidelines, the company cannot decrypt data on devices running iOS 8 and later because of technical circumstance, not moral obligation.

“The files to be extracted are protected by an encryption key that is tied to the user’s passcode, which Apple does not possess,” the guidelines read.
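Apple’s public security documentation for iOS describes the passcode as being “entangled” with a key fused into each device’s hardware. A rough sketch of that idea, using the standard PBKDF2 function with names and parameters chosen purely for illustration (the real scheme differs), shows why there is no stored key for Apple to surrender:

```python
# Illustrative sketch: a data-protection key derived from the passcode.
# The PBKDF2 parameters and the DEVICE_UID stand-in are assumptions for
# this example; they are not Apple's actual key-derivation scheme.
import hashlib

DEVICE_UID = bytes.fromhex("00" * 32)  # stand-in for a hardware-fused, per-device key


def derive_data_key(passcode: str) -> bytes:
    # The key is computed fresh from the passcode on every unlock; nothing
    # stored on the device, or anywhere at Apple, suffices to rebuild it.
    return hashlib.pbkdf2_hmac(
        "sha256",              # hash algorithm
        passcode.encode(),     # the secret only the user knows
        DEVICE_UID,            # salt entangled with hardware-unique material
        100_000,               # iteration count that slows each guess
    )
```

Under a scheme like this, every guess must pay the full cost of the derivation, and the hardware key never leaves the device; Apple can truthfully say it possesses nothing to hand over.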

Tuesday’s order does not compel Apple to decrypt the data on the San Bernardino device, which could be interpreted as in keeping with Apple’s own guidelines.

However, the order does go one step further, compelling Apple to provide whatever software alterations would be required — including specialized firmware updates — to disable any phone-wiping function that may be present.

Apple’s public response — which generated enough attention Wednesday to become an issue in the South Carolina presidential primaries — sidesteps a critical new question in the ongoing debate over law enforcement’s right to acquire data critical to an investigation: Can a manufacturer help law enforcement crack encrypted files on a device without a back door, while still maintaining the appearance of protecting encryption?

In the response, Cook argues that building what he calls “a new version of the iPhone operating system” for the FBI would, even if unintentionally, give the bureau the means to unlock any iPhone, not just one.

The order explicitly states that the software the FBI seeks from Apple “will not modify the iOS on the actual phone, the user data partition or system partition on the device’s flash memory.”

Cook’s response may be interpreted as an effort to deflect attention from a more directly pertinent issue.

Fra... gee... lay?

In what could be read as a remarkable concession about the relative fragility of encryption as a protection tool, Cook argued that “the ‘key’ to an encrypted system ... is only as secure as the protections around it.” Indirectly, the statement suggests that defeating the phone-wipe function would be as effective as a back door itself.
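The arithmetic behind that concession is straightforward. Assuming each passcode guess costs on the order of 80 milliseconds of key-derivation work (a figure drawn from public descriptions of iOS and used here purely for illustration), a short numeric passcode falls quickly once the surrounding protections are gone:

```python
# Back-of-the-envelope arithmetic for brute-forcing a passcode once the
# auto-erase and inter-attempt delays are disabled. The 80 ms per-guess
# cost is an assumption; treat every number here as illustrative.

ATTEMPT_COST_S = 0.08        # ~80 ms of key-derivation work per guess
four_digit_codes = 10 ** 4   # 10,000 possible 4-digit passcodes
six_digit_codes = 10 ** 6    # 1,000,000 possible 6-digit passcodes

print(f"4 digits, worst case: {four_digit_codes * ATTEMPT_COST_S / 60:.0f} minutes")
print(f"6 digits, worst case: {six_digit_codes * ATTEMPT_COST_S / 3600:.1f} hours")
# -> roughly 13 minutes and 22.2 hours, respectively
```

In other words, if the wipe function and attempt delays are all that stand between a four-digit passcode and the data, then defeating them is, as Cook implies, functionally equivalent to a back door.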

If Cook were correct, then the entire issue of compelling Apple to help the FBI crack the phone would be rendered moot. The FBI could just as easily seek a similar order directed at the maker of the security protections on the device — which may or may not be Apple — and achieve its stated purpose.

Either the encryption scheme on Apple’s newer iOS devices is as foolproof as its legal guidelines say it is, or as fragile as Tim Cook portrays it to be.

Assuming the former, Apple could provide the FBI with any level of assistance imaginable and, after investigators failed to achieve their objectives, end up proving its own point.

Yet if it’s the latter, the real issue is not whether a company has an obligation to hand over the decryption key to law enforcement whenever it asks, but whether it can comply with a directive to help investigators break through by whatever means the company has available.

Recall that Apple is on record as perfectly willing to supply law enforcement with a copy of encrypted files — letting them “have at it,” if you will — for older iOS devices.

Successfully providing such assistance for newer devices would be a tacit admission by the manufacturer that its encryption may not be as strong as it’s marketed to be.

Unless, of course, the whole operation is conducted in secret. The high-profile nature of the San Bernardino case rendered that option impossible, forcing Apple either to comply and lose face with its customers or to mount a PR campaign.

Such a campaign may yet succeed in deflecting public attention away from the technical effectiveness — or ineffectiveness — of the encryption on the devices the company makes.

The Magistrate’s order suggests a back door is not necessary — that there’s an easier way to crash through the front gate. Apple’s response ignores that suggestion.

In a statement issued Wednesday, the FBI Agents Association (which is not directly connected with the FBI) blamed clever marketing for letting this entire matter spin out of control in the first place.

“Technology companies want to use concerns about identity theft to market unlockable devices, devices that allow individuals to communicate across borders with no ability for law enforcement to obtain information — even with a lawful warrant,” the FBIAA wrote. “Unfortunately, these marketing efforts, reinforced with intense lobbying of policymakers by the high-tech community, may be providing a safe haven for terrorist and criminal networks.”
