Apple’s upcoming iOS 10 operating system will include “differential privacy” technology designed to let the company glean useful insights from large pools of customer data without being able to identify individual users or violate their privacy.
Apple briefly announced the differential privacy capability this week at its Worldwide Developers Conference (WWDC 2016) in San Francisco, describing it as a way to analyze usage patterns across large numbers of users while adhering to its own strict privacy guidelines.
“Security and privacy are fundamental to the design of Apple hardware, software and services,” Apple said in a statement issued at the conference. “iMessage, FaceTime and HomeKit use end-to-end encryption to protect your data by making it unreadable by Apple and others. iOS 10 uses on-device intelligence to identify the people, objects and scenes in Photos, and power QuickType suggestions. Services like Siri, Maps and News send data to Apple’s servers, but this data is not used to build user profiles.”
Now, beginning with iOS 10, which is slated for release in the fall, Apple will integrate differential privacy to learn more about user behavior, according to the company. “In iOS 10, this technology will help improve QuickType and emoji suggestions, Spotlight deep link suggestions and Lookup Hints in Notes,” Apple said.
Apple, however, has so far released no in-depth information about how the technology will work, leaving some security and privacy experts concerned and wondering what it will actually mean in practice.
“We know very little about the details, but it seems to be an anonymization technique designed to collect user data without revealing personal information,” longtime IT security expert Bruce Schneier wrote in a June 16 post on his Schneier on Security blog. “What we know about anonymization is that it’s much harder than people think, and it’s likely that this technique will be full of privacy vulnerabilities.”
Schneier, who is abroad and could not be reached for additional comment by eWEEK, wrote in the post that “while I applaud Apple for trying to improve privacy within its business models, I would like some more transparency and some more public scrutiny.”
Greg Norcie, the staff technologist for the Center for Democracy & Technology in Washington, told eWEEK that more specifics are needed to accurately gauge what the inclusion of differential privacy in iOS 10 will bring.
“Without the details, I can say that in general [Apple says] they’ve found a way to do this differential privacy, but without being able to look at the algorithms and look at the details, it’s hard to comment on their specific implementation,” said Norcie. Other companies have employed differential privacy techniques as well, he added; Google, for example, began using it in 2014 for Chrome browser analytics.
“There’s a lot of academic research on the subject,” Norcie said. “Roughly it usually means a high-level ‘hand-the-data-off’ [for analysis] … without invading a person’s privacy.”
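Apple has not published the algorithm it intends to use, but one classic building block of differentially private data collection is randomized response, the idea underlying Google’s RAPPOR system for Chrome. The Swift sketch below is purely illustrative and is not drawn from Apple’s implementation; it shows how per-device coin flips give each user plausible deniability about a sensitive yes/no answer while still letting an aggregator estimate how common the answer is across a large population.

```swift
import Foundation

// A minimal randomized-response sketch (illustrative only, not Apple's method).
// Each device flips coins before reporting a sensitive yes/no value, so any
// single report is deniable, yet the aggregate frequency remains estimable.
func randomizedResponse(truth: Bool) -> Bool {
    // First coin flip: report honestly half the time...
    if Bool.random() {
        return truth
    }
    // ...otherwise report a uniformly random answer, hiding the true value.
    return Bool.random()
}

// Recover an estimate of the true "yes" rate from the noisy reports.
// Under this scheme, P(report yes) = 0.5 * trueRate + 0.25,
// so trueRate ≈ 2 * (observedRate - 0.25).
func estimateTrueRate(reports: [Bool]) -> Double {
    let observed = Double(reports.filter { $0 }.count) / Double(reports.count)
    return 2.0 * (observed - 0.25)
}

// Simulate 100,000 users, 30 percent of whom truly answer "yes".
let trueAnswers = (0..<100_000).map { _ in Double.random(in: 0..<1) < 0.3 }
let reports = trueAnswers.map { randomizedResponse(truth: $0) }
print("Estimated yes-rate:", estimateTrueRate(reports: reports))
```

Run over many simulated users, the estimate converges near the true 30 percent rate even though no individual report can be trusted on its own, which is the trade-off differential privacy formalizes: noise protects each person, scale preserves the aggregate signal.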
It will be interesting to learn more about the concept when Apple makes such information available in the future, he said. “It could be as simple as they are [presently] typing up a technical report about it and a week from now we will have the details. Who knows?”