Why Apple doesn't want FBI to hack San Bernardino shooter's iPhone

Apple CEO Tim Cook says the FBI wants to build a backdoor into all iPhones. 

Apple CEO Tim Cook responds to a question during a news conference in New York in April 2015. Cook said Wednesday his company will resist a federal magistrate's order to hack its own users in connection with the investigation of the San Bernardino, Calif., shootings.

Richard Drew/AP/File

February 17, 2016

Apple Inc. CEO Tim Cook says his company will fight a federal magistrate's order to hack its users in connection with the investigation of the San Bernardino shootings, asserting that such a move would undermine encryption by creating a backdoor that could be used on other devices in the future.

Cook's ferocious response, posted early Wednesday on the company's website, came after an order from U.S. Magistrate Judge Sheri Pym that the company help the Obama administration break into an encrypted iPhone belonging to one of the shooters in the December attack.

The first-of-its-kind ruling was a significant victory for the Justice Department in a technology policy debate that pits digital privacy against national security interests.

Noting the order, Cook said "this moment calls for public discussion, and we want our customers and people around the country to understand what is at stake." He argued that the order "has implications far beyond the legal case at hand."

Pym's order to Apple to help the FBI hack into an encrypted iPhone belonging to Syed Farook, one of the San Bernardino, Calif., shooters, set the stage for a legal fight between the federal government and Silicon Valley. The ruling by Pym, a former federal prosecutor, requires Apple to supply highly specialized software the FBI can load onto the county-owned work iPhone to bypass a self-destruct feature, which erases the phone's data after too many unsuccessful attempts to unlock it. The FBI wants to be able to try different combinations in rapid sequence until it finds the right one.
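
The mechanics are worth spelling out. As a rough illustration, the Python sketch below models a passcode check guarded by a wipe-after-10-failures counter: a four-digit passcode, for example, has only 10,000 possible values, so a computer allowed to guess without limit finds it almost instantly, while the self-destruct counter ends the search after 10 tries. The class, the passcode, and the numbers here are hypothetical stand-ins for the purpose of illustration, not Apple's actual implementation.

```python
from itertools import product


class SimulatedLockScreen:
    """Toy model of a passcode check with a wipe-after-N-failures counter.

    An illustration only; not how iOS actually stores or verifies passcodes.
    """

    def __init__(self, passcode, max_failures=10):
        self._passcode = passcode
        self._failures = 0
        self._max_failures = max_failures
        self.wiped = False  # True once the self-destruct feature has erased the data

    def try_unlock(self, guess):
        if self.wiped:
            return False
        if guess == self._passcode:
            return True
        self._failures += 1
        if self._failures >= self._max_failures:
            self.wiped = True  # too many bad guesses: the data is gone
        return False


def brute_force(phone):
    """Try every four-digit passcode in rapid sequence, as the FBI proposes."""
    for digits in product("0123456789", repeat=4):
        guess = "".join(digits)
        if phone.try_unlock(guess):
            return guess
        if phone.wiped:
            return None  # the self-destruct feature won
    return None


# With the counter active, the attack dies after 10 of 10,000 guesses.
print(brute_force(SimulatedLockScreen("7395", max_failures=10)))      # None
# With the counter effectively disabled, every combination can be tried.
print(brute_force(SimulatedLockScreen("7395", max_failures=10_001)))  # 7395
```

The order, as described above, would require software that disables that counter so the FBI could run the second, unconstrained kind of search.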

The order represents a significant victory for the Justice Department and the Obama administration, which has embraced stronger encryption as a way to keep consumers safe on the Internet but has struggled to find a compelling example to make its case.

Federal prosecutors told the judge in a court application that they can't access a work phone used by Farook because they don't know his passcode and Apple has not cooperated. Under U.S. law, a work phone is generally the property of a person's employer. The magistrate judge told Apple in Tuesday's proceeding to provide an estimate of its cost to comply with her order, suggesting that the government will be expected to pay for the work.

Apple has provided default encryption on its iPhones since 2014, allowing any device's contents to be accessed only by the user who knows the phone's passcode.
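
For a concrete picture of what that means, the sketch below, again only a simplified illustration, derives an encryption key from a passcode and uses it to lock a piece of data; without the right passcode, the stored bytes stay unreadable. It relies on the third-party Python cryptography package and a made-up payload; Apple's real design is considerably more elaborate (among other things, it entangles the passcode with device-specific hardware keys), which this toy example does not attempt to model.

```python
import base64
import hashlib
import os

from cryptography.fernet import Fernet, InvalidToken  # third-party: pip install cryptography


def key_from_passcode(passcode, salt):
    """Stretch a short passcode into a 32-byte encryption key using PBKDF2."""
    raw = hashlib.pbkdf2_hmac("sha256", passcode.encode(), salt, 200_000, dklen=32)
    return base64.urlsafe_b64encode(raw)


salt = os.urandom(16)                      # stored on the device; not secret by itself
secret_note = b"texts, photos, locations"  # hypothetical stand-in for the phone's contents

# Locking the data: anyone holding only the ciphertext learns nothing useful.
ciphertext = Fernet(key_from_passcode("7395", salt)).encrypt(secret_note)

# Only the right passcode reproduces the key that decrypts it.
print(Fernet(key_from_passcode("7395", salt)).decrypt(ciphertext))
try:
    Fernet(key_from_passcode("0000", salt)).decrypt(ciphertext)
except InvalidToken:
    print("wrong passcode: data stays unreadable")
```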

The order requires that the software Apple provides be programmed to work only on Farook's phone, but it was not clear how readily that safeguard could be circumvented. The order said Apple has five days to notify the court if it believes the ruling is unreasonably burdensome.

It also was not immediately clear what investigators believe they might find on Farook's work phone or why the information would not be available from third-party service providers, such as Google or Facebook, though investigators think the device may hold clues about whom the couple communicated with and where they may have traveled.

The phone was running the newest version of Apple's iPhone operating system. San Bernardino County provided Farook with an iPhone configured to erase data after 10 consecutive unsuccessful unlocking attempts. The FBI said that feature appeared to be active on Farook's iPhone as of the last time he performed a backup.

In his website posting, Cook said the U.S. government order would undermine encryption by using specialized software to create what amounts to a back door, one he compared to a "master key, capable of opening hundreds of millions of locks."

"In the wrong hands, this software — which does not exist today — would have the potential to unlock any iPhone in someone's physical possession," Cook wrote. "The FBI may use different words to describe this tool, but make no mistake: Building a version of iOS that bypasses security in this way would undeniably create a back door. And while the government may argue that its use would be limited to this case, there is no way to guarantee such control."

The case stems from the San Bernardino attack, the deadliest assault by extremists on U.S. soil since the 2001 attacks on the World Trade Center and the Pentagon. Syed Farook and his wife, Tashfeen Malik, killed 14 people in a Dec. 2 shooting at a holiday luncheon for Farook's co-workers. The couple later died in a gun battle with police.

The couple took pains to physically destroy two personally owned cell phones, crushing them beyond the FBI's ability to recover information from them. They also removed a hard drive from their computer; it has not been found, even after investigators spent days diving in a nearby lake in search of electronic evidence.

Farook was not carrying his work iPhone during the attack. It was found during a later search. It was not known whether Farook forgot about the iPhone or did not care whether investigators found it.

The judge didn't spell out her rationale in her three-page order, but the ruling comes amid a similar case in the U.S. District Court for the Eastern District of New York.

Investigators are still working to piece together a missing 18 minutes in Farook and Malik's timeline from Dec. 2. Investigators have concluded they were at least partly inspired by the Islamic State group; Malik's Facebook page included a note pledging allegiance to the group's leader around the time of the attack.

In 2014, Apple updated its iPhone operating system to require that the phone be locked by a passcode that only the user knows. Previously, the company could use an extraction tool that would physically plug into the phone and allow it to respond to search warrant requests from the government.

Here's the full text of the Apple CEO's message:

The United States government has demanded that Apple take an unprecedented step which threatens the security of our customers. We oppose this order, which has implications far beyond the legal case at hand. 

This moment calls for public discussion, and we want our customers and people around the country to understand what is at stake.

The Need for Encryption

Smartphones, led by iPhone, have become an essential part of our lives. People use them to store an incredible amount of personal information, from our private conversations to our photos, our music, our notes, our calendars and contacts, our financial information and health data, even where we have been and where we are going.

All that information needs to be protected from hackers and criminals who want to access it, steal it, and use it without our knowledge or permission. Customers expect Apple and other technology companies to do everything in our power to protect their personal information, and at Apple we are deeply committed to safeguarding their data.

Compromising the security of our personal information can ultimately put our personal safety at risk. That is why encryption has become so important to all of us.

For many years, we have used encryption to protect our customers’ personal data because we believe it’s the only way to keep their information safe. We have even put that data out of our own reach, because we believe the contents of your iPhone are none of our business.

The San Bernardino Case

We were shocked and outraged by the deadly act of terrorism in San Bernardino last December. We mourn the loss of life and want justice for all those whose lives were affected. The FBI asked us for help in the days following the attack, and we have worked hard to support the government’s efforts to solve this horrible crime. We have no sympathy for terrorists.

When the FBI has requested data that’s in our possession, we have provided it. Apple complies with valid subpoenas and search warrants, as we have in the San Bernardino case. We have also made Apple engineers available to advise the FBI, and we’ve offered our best ideas on a number of investigative options at their disposal.

We have great respect for the professionals at the FBI, and we believe their intentions are good. Up to this point, we have done everything that is both within our power and within the law to help them. But now the U.S. government has asked us for something we simply do not have, and something we consider too dangerous to create. They have asked us to build a backdoor to the iPhone.

Specifically, the FBI wants us to make a new version of the iPhone operating system, circumventing several important security features, and install it on an iPhone recovered during the investigation. In the wrong hands, this software — which does not exist today — would have the potential to unlock any iPhone in someone’s physical possession.

The FBI may use different words to describe this tool, but make no mistake: Building a version of iOS that bypasses security in this way would undeniably create a backdoor. And while the government may argue that its use would be limited to this case, there is no way to guarantee such control.

The Threat to Data Security

Some would argue that building a backdoor for just one iPhone is a simple, clean-cut solution. But it ignores both the basics of digital security and the significance of what the government is demanding in this case.

In today’s digital world, the “key” to an encrypted system is a piece of information that unlocks the data, and it is only as secure as the protections around it. Once the information is known, or a way to bypass the code is revealed, the encryption can be defeated by anyone with that knowledge.

The government suggests this tool could only be used once, on one phone. But that’s simply not true. Once created, the technique could be used over and over again, on any number of devices. In the physical world, it would be the equivalent of a master key, capable of opening hundreds of millions of locks — from restaurants and banks to stores and homes. No reasonable person would find that acceptable.

The government is asking Apple to hack our own users and undermine decades of security advancements that protect our customers — including tens of millions of American citizens — from sophisticated hackers and cybercriminals. The same engineers who built strong encryption into the iPhone to protect our users would, ironically, be ordered to weaken those protections and make our users less safe.

We can find no precedent for an American company being forced to expose its customers to a greater risk of attack. For years, cryptologists and national security experts have been warning against weakening encryption. Doing so would hurt only the well-meaning and law-abiding citizens who rely on companies like Apple to protect their data. Criminals and bad actors will still encrypt, using tools that are readily available to them.

A Dangerous Precedent

Rather than asking for legislative action through Congress, the FBI is proposing an unprecedented use of the All Writs Act of 1789 to justify an expansion of its authority.

The government would have us remove security features and add new capabilities to the operating system, allowing a passcode to be input electronically. This would make it easier to unlock an iPhone by “brute force,” trying thousands or millions of combinations with the speed of a modern computer.

The implications of the government’s demands are chilling. If the government can use the All Writs Act to make it easier to unlock your iPhone, it would have the power to reach into anyone’s device to capture their data. The government could extend this breach of privacy and demand that Apple build surveillance software to intercept your messages, access your health records or financial data, track your location, or even access your phone’s microphone or camera without your knowledge.

Opposing this order is not something we take lightly. We feel we must speak up in the face of what we see as an overreach by the U.S. government.

We are challenging the FBI’s demands with the deepest respect for American democracy and a love of our country. We believe it would be in the best interest of everyone to step back and consider the implications.

While we believe the FBI’s intentions are good, it would be wrong for the government to force us to build a backdoor into our products. And ultimately, we fear that this demand would undermine the very freedoms and liberty our government is meant to protect.

Tim Cook