Apple emphasizes users' privacy amid its parade of updates

Privacy-enhancing technologies underpin many of the software updates that Apple announced to its operating systems at its annual developer conference in San Francisco this week.

Craig Federighi, Apple's senior vice president of software engineering, talked about photos within iOS at the company's Worldwide Developers Conference in San Francisco on Monday.

Stephen Lam/Reuters

June 14, 2016

As Apple executives unveiled a slew of updates – an improved music service and new mobile operating systems among them – at its developer conference in San Francisco, the company that recently tangled with government officials over iPhone security reinforced its commitment to advancing users' privacy.

“We believe you should have great features and great privacy,” said Craig Federighi, Apple's senior vice president of software engineering, on Monday. “You demand it, and we are dedicated to providing it.”

This year's conference is Apple's first since the legal battle between the FBI and the tech giant over access to an iPhone used by a gunman in last year's San Bernardino terrorist attack. At the time, Apple refused to help the FBI unlock the phone, saying that doing so would ultimately damage the security of all its users and set a chilling precedent that would further erode privacy in the Digital Age.

Now, Apple is putting user privacy at the center of many of its updates, aiming to make it a major selling point and a way of differentiating itself from competitors Facebook and Google, whose business models rely largely on collecting users' data.

Consider the company's plan to improve its photo management software by teaching it to recognize faces or to group images taken during a vacation, for instance.

While Facebook has offered facial recognition for years, and Google is often praised for its ability to home in on the most valuable photos after analyzing users' image collections, Apple's offering differs in that all of the artificial intelligence powering this feature runs on the device, rather than on a server controlled by someone else.

This means customers will be able to take advantage of a smarter photo app without having to trust remote servers to analyze the data embedded in their individual photos.
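As a rough illustration of that architectural difference, here is a minimal, hypothetical sketch of on-device photo grouping. The function names and the toy "embedding" are inventions for this example, not Apple's APIs; the privacy property comes simply from the fact that photos are read and compared in the phone's own memory, and nothing is uploaded.

```python
def embed(photo_bytes: bytes) -> list:
    # Stand-in for a real on-device neural network: derive a tiny
    # feature vector from the raw bytes so the example runs anywhere.
    return [sum(photo_bytes[i::4]) % 256 / 255.0 for i in range(4)]

def group_photos_on_device(photos: list, threshold: float = 0.1) -> list:
    # All analysis happens locally: raw photos are never transmitted,
    # only compared against each other on the device itself.
    groups = []  # list of (centroid, member indices) pairs
    for idx, photo in enumerate(photos):
        vec = embed(photo)
        for centroid, members in groups:
            if sum(abs(a - b) for a, b in zip(vec, centroid)) < threshold:
                members.append(idx)
                break
        else:
            groups.append((vec, [idx]))
    return [members for _, members in groups]
```

In the server-side model, the raw photo bytes would be sent over the network before any of this analysis could run; here, they never leave the function's caller.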

Improvements to the iPhone's oft-mocked autocorrect feature – which tries to predict what you type but is often laughably inaccurate – are another example. While the update may be a step behind what Google plans for Allo, a messaging app that will recommend responses to messages based on what was said in the past, Apple's solution prioritizes users' privacy.


Apple is relying on a concept known as differential privacy to enable the update to its QuickType keyboard. The idea is that Apple will be able to use data gleaned from its customers to make its products smarter without accessing anyone's personally identifiable information. Differential privacy is an established concept, but when Apple introduces its implementation with iOS 10, it will become the technique's largest publicly known deployment.

“Differential privacy is a mathematical definition of privacy that uses statistics to obscure individual data points. [Using] this would allow Apple to collect data, like all the words you type and in which order, in a form that doesn't end up with you just sending all of your chat logs to Apple,” says Kyle Lady, senior research and development engineer at the cybersecurity firm Duo Security. 

“This approach can be used to strike a balance between mass surveillance over everything you ever type and the current model, where Apple ships a prediction engine that learns from just what you type, which isn't really that much, compared to all the words everyone else is typing," he says.
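Apple hasn't published the details of its system, but randomized response – a classic mechanism that satisfies differential privacy – gives a sense of how this balance works: each device perturbs its answer before sending it, and the collector inverts the known noise rate to recover accurate statistics only in aggregate. The sketch below is illustrative, not Apple's implementation.

```python
import math
import random

def randomized_response(truth: bool, epsilon: float = 1.0) -> bool:
    # Report the user's true bit with probability e^eps / (e^eps + 1),
    # otherwise lie. No single report reveals the user's real answer.
    p_truth = math.exp(epsilon) / (math.exp(epsilon) + 1.0)
    return truth if random.random() < p_truth else not truth

def estimate_true_rate(reports: list, epsilon: float = 1.0) -> float:
    # Invert the known noise rate to recover the population-level
    # frequency; individual reports remain plausibly deniable.
    p_truth = math.exp(epsilon) / (math.exp(epsilon) + 1.0)
    observed = sum(reports) / len(reports)
    return (observed + p_truth - 1.0) / (2.0 * p_truth - 1.0)

# Simulate 100,000 users, 30% of whom actually type the word in question.
reports = [randomized_response(random.random() < 0.3) for _ in range(100_000)]
print(f"estimated rate: {estimate_true_rate(reports):.3f}")  # close to 0.300
```

The collector learns that roughly 30 percent of users type the word, but any individual report is wrong a predictable fraction of the time, so it can't be traced back to what any one person actually typed.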

Some of this may sound too good to be true, especially given Apple's previous struggles to implement cryptography in products that are supposed to be secure, such as its popular iMessage service. That flaw has since been fixed, but it remains a blemish on Apple's security record, one that offers a sense of how wide the gulf can be between the company's idealism and its actual capabilities.

“I trust that they are not actively making bad security and privacy decisions, but doing all of this correctly is hard indeed,” says Mr. Lady of Duo Security. “The iMessage flaw was discovered by researchers at Johns Hopkins, who worked on the problem from a position of having no base assumptions, thus revealing [flaws that were] hidden by assumptions and oversights on the part of Apple.”

These are just the latest of Apple’s efforts to bolster the security of its products. The company has made several key hires, including Jon Callas and Frederic Jacobs – both noted technologists and entrepreneurs behind several privacy-focused startups – to expand its security team.

It's also said to be weighing options for making iCloud – which doesn't currently encrypt data in a way that prevents Apple from accessing it – more secure, so that the company couldn't turn over that information in response to government requests.

“For many years, we have used encryption to protect our customers’ personal data because we believe it’s the only way to keep their information safe,” said Tim Cook, Apple's chief executive officer, in a letter to customers during the height of Apple's fight with the FBI in February. “We have even put that data out of our own reach, because we believe the contents of your iPhone are none of our business.”