Influencers: Apple should not help FBI crack San Bernardino iPhone
A strong majority of Passcode Influencers said that Apple should not comply with a US court’s order to help the FBI get into the San Bernardino shooter’s iPhone.
“Apple should use every resource in its tool chest to fight against a government-mandated backdoor that makes us all far less safe,” said Sascha Meinrath, director of the X-Lab, a tech policy think tank. “Today’s battle is not about San Bernardino, it’s about the integrity of our information and communications and our fundamental right to privacy.”
A magistrate judge in Riverside, Calif., on Tuesday ordered Apple to disable built-in security features on the iPhone of one of the Islamic State-inspired shooters who killed 14 people in December, which could allow the FBI to run a program that would crack the phone’s passcode faster. Apple CEO Tim Cook vowed to fight the court order, saying in a letter to Apple customers that creating new, weaker software designed to get around the security features is tantamount to building a backdoor. That is simply “too dangerous to create,” he warned, because it could be used not just in this case but to break into Apple products again and again.
As the world’s largest tech company goes head-to-head with the US government, the battle promises to be an important test case in the simmering encryption debate. And 60 percent of Passcode’s pool of more than 130 digital security and privacy experts who took the survey sided with Apple.
Kevin Bankston, director of New America’s Open Technology Institute, worries that if Apple helps the FBI in this case, it will set a dangerous legal precedent. “This case isn’t about just one backdoor into just one iPhone,” Mr. Bankston said. “If Apple builds this backdoor software, our government – and less savory governments around the world – will demand to use this new software every time they encounter an encrypted iPhone.”
The precedent would not stop at just Apple and its iPhones, either, Bankston said. “If a court can legally compel Apple to build this backdoor, then it likely could also compel any other software provider to do the same, including compelling the secret installation of surveillance software via automatic updates to your phone or laptop,” he said. “Such a broad precedent would spell digital disaster for the trustworthiness of any and every computer and mobile device… Apple is right to fight this. A loss would not only undermine its product but its entire industry and the security of every user of digital technology.”
Many Influencers who argued Apple should fight the order said they were sympathetic to the FBI’s plight, acknowledging that increasingly pervasive encryption and stronger security measures do make investigators’ jobs harder. They also conceded that this stance may be especially controversial in a sensitive case such as this one. But the court’s solution, Influencers said, would put the American public’s security at risk in the long term.
“Law enforcement has legitimate concerns about access to information. But this short-term fix hurts our cybersecurity in the long term,” said Jenny Durkan, chair of the Cyber Law and Privacy Group at the Quinn Emanuel law firm. “It is a dangerous precedent to order a company to purposely breach a product’s security. When that product is as important and relevant as the iPhone, the security and privacy risks are too great. The sad fact is that right now we are incapable of protecting our most vital information. Purposely creating more flaws is short-sighted.”
The security of Internet-connected devices, added Christian Dawson, cofounder of the Internet Infrastructure Coalition (i2Coalition), which comprises leading Internet infrastructure providers and technology firms, “is vitally important to the economy, and the millions of consumers who rely upon being able to keep their personal data and financial, medical, and legal records secure.”
“Requiring workarounds that weaken encryption could have broad ramifications and the potential to jeopardize the security and safety of all users,” Mr. Dawson stressed. While Internet infrastructure companies work with law enforcement agencies every day to combat and deter crime, he said, the industry “cannot support a government mandate to weaken security standards.”
A 40 percent minority of Passcode Influencers said Apple should help the FBI because there is a court order. While companies such as Apple have a right to design products that protect their customers, they said, that right does not extend beyond the reach of law enforcement.
“Society relies on civil support to law enforcement. Privacy does not extend to hiding evidence of a crime. This is a big question of our industry today: Is it legitimate to create items with features designed to hide and/or destroy evidence?” said one Influencer, who preferred to remain anonymous. Influencers have the option to reply on-record or anonymously to preserve the candor of their responses.
“The argument that we must build systems to prevent crime is legitimate,” the Influencer continued. “But enabling other crimes while trying to prevent others is not a legitimate trade.”
Another Influencer who thought Apple should comply with the order said this will be an important test for how companies behave under the rule of American law. “All of the slippery slope, ‘What happens when the Chinese try to do the same,’ and similar arguments are nice rhetorical lines with zero basis in our Constitutional traditions,” the Influencer said.
“Of course this puts Apple in a tricky spot, in the same way, say, other industries have had to – consistent with a court order and hence compelling public purpose – also disclose their customers’ data. Or the same way certain companies don’t and won’t do business in places like China or Russia. Apple has a clear choice: act consistent with the rule of law of the United States or create its own, pan-international law that is somehow unhinged from our legal and Constitutional traditions.”
Other Influencers say they support strong encryption and other security measures – and worry that Apple’s refusal to comply will ultimately hurt the battle for consumer security. “This is the wrong issue for Apple to fall on their sword over,” one Influencer said. “It’s a hack that would only impact an outdated device they no longer sell, enabling them to mitigate impact over a broader backdoor. They run the risk that this stance backfires and leads to broader restrictions on encryption.”
While many Influencers said Apple seems to be on the right side of the encryption issue, some warned that in a highly politicized election year, the government’s arguments about terrorism and public safety could trump arguments for encryption.
“Apple would like this to be an issue about technology, encryption, and customer trust,” an Influencer said. “In an election year, they will not be able to sustain that argument. The campaigns will make it an issue about terrorism and public safety. Apple is right on the encryption debate. But that’s not the battleground on which this fight will take place.”
What do you think? VOTE in the readers’ version of the Passcode Influencers Poll.
COMMENTS:
NO
“This is a perfect test case for the government concept of forcing companies to provide backdoors into their security, because it involves a terror plot and it lacks exigency. That gives courts and policymakers ample time to fully examine this question. In my opinion, the FBI’s probable cause that relevant evidence will be found on the phone is pretty thin, and the consequences are dire: there is no such thing as a one-time backdoor, nor is there such a thing as a backdoor that can be used only for good.” - Nick Selby, StreetCred Software
“Vendors should and will help law enforcement when court orders are obtained for specific access. However, the FBI is asking Apple to provide a compromised version that could subvert security in any iPhone, a very different story. The courts need to rule on this first.” - John Pescatore, SANS Institute
“Unfortunately, if Apple were to acquiesce to this demand, there is no limit to the sort of similar demands they are likely to receive from other courts and countries. Everybody thinks these kinds of lengths are demanded by the facts of their case.” - Tom Cross, Drawbridge Networks
“18th century laws (the All Writs Act) are out of step with 21st century technology.” - Marc Rotenberg, Electronic Privacy Information Center
“Not only do I not think Apple should comply but I also believe those arguing for Apple to comply, such as the White House spokesman, have been misleading the public. As an example, the White House spokesman stated that Apple could do this simply and in a limited way. There is nothing simple or limited about this request. Policymakers and law enforcement officials may want something, it may be a good debate to have, but misleading the public on the technological challenges and repercussions does not benefit anyone.” - Robert Lee, Dragos Security
“This case highlights one of the most crucial open questions in the cryptowars debate: In a rule of law country, what tools are reasonable for law enforcement? I don’t believe those tools should include mandatory backdoors or exceptional access systems for encrypted tech. But, on the facts of this case, is this an appropriately narrowly tailored request by the FBI? Or would it establish dangerous precedent allowing law enforcement agencies to compel technology companies to hack into customers’ devices? What happens when those devices include IoT products - like sensors and cameras in our homes, information from our cars, or information from wearables?” - Influencer
“Tim Cook seemed to capture the issue well when he said that Apple has been ‘asked for something that they don’t have, and is too dangerous to create.’ If the U.S. Government wants the authorities with which to mandate such an effort, passing a law to do it would be a reasonable first step.” - Bob Stratton, MACH37
“I agree with the Internet Association’s statement, which says that governments should not require the weakening of these necessary security standards. Efforts to weaken or undermine strong encryption harm consumers and undermine our national security.” - Abigail Slater, Internet Association
“Fight the power.” - Influencer
“Security and privacy are crucial to the future of trusted communications. Without trust, all networks fail. Apple and Google are correct in serving the public good by not creating backdoors or ‘master keys’ for anyone for any reason. The controversy being generated by this U.S. court order has allowed the debate to be taken to the people. The court order and its threat to civil liberties is of great benefit because it has made the debate more public, and millions here and abroad are learning about how and why encrypted communications are crucial as we create new ways of working in the digital age. Trust is everything.” - Influencer
“Privacy must trump terrorism. The rights of U.S. citizens have already been diluted due to our first ill-informed response to 9/11 called the Patriot Act. The government showed that it could not be trusted to apply those provisions narrowly and with great discretion. Now the U.S. government is making the same case that in order to stop terrorism the American people must once again surrender certain rights in order to be safe. I applaud Apple for not caving in to that argument and hope that my peers will join me in placing the need for privacy above making law enforcement’s job easier.” - Jeffrey Carr, Taia Global
“There are two issues here: one smaller and one larger. The smaller one is that law enforcement is asking Apple to hack their own phone – to write code that will essentially break it. It is analogous to law enforcement asking the Ford Motor Company to spend resources in order to put a permanent flaw in their latest truck engine. I don’t think that has ever been done before. The larger issue is this. Apple, and other Silicon Valley companies, decided a while ago that they did not want to hold a key to any backdoor that law enforcement might want to leverage in the future; not for obvious terrorist cases like this but for all citizens. They didn’t want to be the law enforcement enabler if and when law enforcement decided to skirt our privacy rights (see the first version of the Patriot Act). The data for these terrorists is not sitting in some database somewhere. It is sitting on the terrorist’s phone. In order to get it, you have to hack the phone. For this larger issue, you have to ask yourself if you want to grant law enforcement the ability to hack into anybody’s phone even if they have a warrant. And before you say yes too quickly, remember that once this flaw is installed, every bad guy on the planet will discover it and leverage it. The FBI, in my opinion, is leveraging our fear of terrorism to give them a back door; not for this terrorist case but for every case in the future. Apple’s decision, I think, is the smart play in the long run; for their customers for sure but for the nation.” - Rick Howard, Palo Alto Networks
“Creating a master iPhone key would have disastrous consequences for data security and US business interests overseas.” - Chris Finan, Manifold Security
“Apple should use every resource in its tool chest to fight against a government-mandated backdoor that makes us all far less safe. Today’s battle is not about San Bernardino, it’s about the integrity of our information and communications and our fundamental right to privacy. Only the most myopic and technologically illiterate believe that fundamentally undermining encryption will magically make us safer – the reality, one well understood by most top technologists, is that strong encryption (like computers themselves) is a huge boon to civil society and must remain safe and secure.” - Sascha Meinrath, X-Lab
“This court order demanding that Apple custom-build malware to undermine its own product’s security features, and then digitally sign that software so the iPhone will trust it as coming from Apple, doesn’t just set us down a slippery slope – it drops us off a cliff. This case isn’t about just one backdoor into just one iPhone. If Apple builds this backdoor software, our government – and less savory governments around the world – will demand to use this new software every time they encounter an encrypted iPhone. But this isn’t just about iPhones, either: if a court can legally compel Apple to build this backdoor, then it likely could also compel any other software provider to do the same, including compelling the secret installation of surveillance software via automatic updates to your phone or laptop. Such a broad precedent would spell digital disaster for the trustworthiness of any and every computer and mobile device. The FBI has already spent the last year arguing for backdoors in front of Congress and at the White House, and now that it’s come up empty it’s trying to get a lower court judge to convert a vague, centuries-old catch-all statute into a powerful government hacking statute. That’s not how we make policy in this country, and Apple is right to fight this – a loss would not only undermine its product but its entire industry and the security of every user of digital technology. A line must be drawn here, and we at OTI are eager to continue to fight to ensure that we can continue to trust the security and integrity of the devices we use every day.” - Kevin Bankston, Open Technology Institute
YES
“Not a good question. Of course they should comply – if they can’t get it reversed on appeal. Should they comply without legally challenging it? No.” - Influencer
“So long as it doesn’t force Apple to redesign its products, modify business processes, or taint future development. If it does, then Apple should fight to get clarity on just how far a court can mandate product features.” - Jeff Moss, DEF CON Communications
“This is the wrong issue for Apple to fall on their sword over. It’s a hack that would only impact an outdated device they no longer sell, enabling them to mitigate impact over a broader backdoor. They run the risk that this stance backfires and leads to broader restrictions on encryption.” - Influencer
“Apple has an obligation to assist the government if it can do so.” - Stewart Baker, Steptoe & Johnson
“They’re willing to harvest every drop of personal information from their users and sell it to the highest bidders via ad networks, but won’t let it be used in a real life/death law enforcement situation?” - Influencer
“The private sector should cooperate with lawful government investigations. No company should be forced to make design changes to the products and services it sells. But if companies have the ability to remotely update devices, then it is reasonable for law enforcement to ask them to use that capability to comply with court orders.” - Influencer
“This is different than a permanent backdoor in all devices.” - Influencer
“Apple would like this to be an issue about technology, encryption, and customer trust. In an election year, they will not be able to sustain that argument. The campaigns will make it an issue about terrorism and public safety. Apple is right on the encryption debate. But that’s not the battleground on which this fight will take place.” - Influencer
“There are ways to ensure that data is provided only from the terrorist’s phone without giving out Apple’s encryption secrets.” - Influencer
“According to Newsweek, Apple has unlocked their phones at least 70 times since 2008. Let’s say that again: APPLE HAS UNLOCKED THEIR PHONES AT LEAST 70 TIMES SINCE 2008!!! So why the big protest now? Because it’s a marketing ploy. The Constitution says there shall be no ‘unreasonable’ searches and seizures; it does NOT say there shall be NO searches and seizures, and it does not specify that companies get to decide what is reasonable. That is the purview of the courts, and in the immediate case, a US court has decided that it is in fact reasonable for Apple to provide the assistance necessary to open ONE SPECIFIC PHONE. So: Apple has done what it’s being asked to do scores of times previously; they are being presented with a legal, constitutional demand to produce information; there is a compelling legal and national security imperative at stake. So of course they should comply, and failing compliance, they should be subject to the full force of US Government sanctions and punishment.” - Influencer
“Bad facts make bad law - and Apple picked just about the worst facts to make their stand and they could end up doing a disservice to both the tech and the privacy communities. Thanks a lot, Apple.” - Influencer
“This is a cat and mouse game where the national security need is clear but industry needs to be ordered over great objection. The overarching issue is the global nature of markets and particular issues being addressed in local legal frameworks.” - Influencer
“Society relies on civil support to law enforcement. Privacy does not extend to hiding evidence of a crime. This is a big question of our industry today: Is it legitimate to create items with features designed to hide and/or destroy evidence? In an analogy, it is one thing to manufacture a knife. We don’t hold the manufacturer responsible for the actions of the holder. It is another thing to make a knife that deliberately destroys DNA evidence on its handle. Likewise, you can’t hide the knife in a commercial safe and refuse to open it for the government with a warrant. The argument that we must build systems to prevent crime is legitimate. I have spent my career on this. But enabling other crimes while trying to prevent others is not a legitimate trade. Privacy advocates fear a precedent under which other countries where they do business will also ask for support with the intent of suppressing their citizens. This argument would seem to be an ethical one on first blush, even admirable. However, it is actually advocating for corporations to establish their own laws, ignoring the jurisdictions in which they do business. Will these corporations use their own internal courts to determine which laws they will respect? We don’t always agree, but we must work within the law.” - Influencer
What do you think? VOTE in the readers’ version of the Passcode Influencers Poll.