Apple is continuing to put blue water between itself and data-mining mobile rivals such as Google by engineering its services with a clear focus on user privacy.
At its WWDC developer conference today, senior vice president of software engineering Craig Federighi showed off an update to Siri called Proactive, which will offer some Google Now-esque predictive features. These include suggesting when a device user needs to leave to get to an event in their schedule, or inferring who might be calling from a landline number that’s not stored in their contacts by parsing data from their email.
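Apple hasn’t published how Proactive makes that caller inference, but as a rough illustration of the kind of on-device parsing it implies, here is a minimal Swift sketch that scans email bodies for phone numbers with Foundation’s NSDataDetector and maps them to sender names. The Email type, the sample data, and the lookup-table approach are all hypothetical, not Apple’s actual implementation.

```swift
import Foundation

// Hypothetical stand-in for a locally stored email message.
struct Email {
    let senderName: String
    let body: String
}

// Extract phone numbers from an email body using NSDataDetector,
// which runs entirely on the device.
func phoneNumbers(in email: Email) -> [String] {
    let types: NSTextCheckingResult.CheckingType = .phoneNumber
    guard let detector = try? NSDataDetector(types: types.rawValue) else {
        return []
    }
    let range = NSRange(email.body.startIndex..., in: email.body)
    return detector.matches(in: email.body, options: [], range: range)
        .compactMap { $0.phoneNumber }
}

// Build a local lookup table: number -> likely caller name.
let inbox = [
    Email(senderName: "Jane Appleseed",
          body: "Call me at the office: (408) 555-0199.")
]
var numberToName: [String: String] = [:]
for email in inbox {
    for number in phoneNumbers(in: email) {
        numberToName[number] = email.senderName
    }
}
print(numberToName) // ["(408) 555-0199": "Jane Appleseed"]
```

The point of the sketch is that nothing here requires a server round-trip: the parsing and the lookup table can both live on the handset.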
But where Google Now does cloud-based processing of user data (and Google mines the personal information it gets access to in order to build detailed user profiles and sell ad-targeting intelligence to advertisers), Apple is doing this data processing locally, on the device.
Apple said the user data that powers its Proactive features stays on the device and remains anonymous; it is not linked to the user’s Apple ID or shared with third parties. This is a defining difference versus cloud companies building big data-mining businesses.
“We don’t mine your email, your photos or your contacts in the cloud to find things out about you… all of this is done on device… under your control,” said Federighi. “We do it in a way that does not compromise your privacy… we honestly just don’t want to know. All of this is done on the device.”
While Apple’s pro-privacy approach offers a clear contrast to how many current cloud businesses handle and mine user data, the most obvious strategic target is Google. Ultimately this is Apple highlighting the core underlying difference between iOS and Android, especially given how much feature overlap remains between the two mobile OSes.
Google’s ad-fueled business model absolutely requires massive data-mining of users. Apple, by contrast, makes the majority of its revenue from selling hardware — so the clear message coming out of Cupertino is: we can afford to protect your privacy.
Google recently revamped its Photos app, making much of a natural-language feature that lets users locate particular photos by searching for generic terms. What it didn’t mention is the background tradeoff: Google will also be searching and cataloguing your photos and photo metadata in order to further flesh out what it knows about you.
Today Federighi showed off a similar photo search feature on iOS, demoing how users can say things to Siri like “show me my photos from last July” and have it pull up the correct images. So again, there’s obvious feature overlap, but behind the scenes the two companies’ philosophies about handling user data are very different.
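For a sense of what an on-device, date-based photo query can look like, here is a minimal Swift sketch using the Photos framework’s PHAsset fetch API. Apple hasn’t said this is how Siri’s search is implemented; the translation of a phrase like “last July” into a concrete date range is assumed to happen elsewhere, and the hard-coded dates below are purely illustrative.

```swift
import Photos

// A minimal sketch of an on-device, date-based photo query using
// the Photos framework. (A real app must first request photo
// library access via PHPhotoLibrary.requestAuthorization.)
func fetchPhotos(from start: Date, to end: Date) -> PHFetchResult<PHAsset> {
    let options = PHFetchOptions()
    // Filter assets by creation date, entirely on the device.
    options.predicate = NSPredicate(
        format: "creationDate >= %@ AND creationDate < %@",
        start as NSDate, end as NSDate
    )
    options.sortDescriptors = [NSSortDescriptor(key: "creationDate",
                                                ascending: true)]
    return PHAsset.fetchAssets(with: .image, options: options)
}

// "Last July" relative to WWDC 2015, hard-coded for illustration;
// in practice Siri would derive this range from the spoken query.
let calendar = Calendar.current
let start = calendar.date(from: DateComponents(year: 2014, month: 7, day: 1))!
let end = calendar.date(from: DateComponents(year: 2014, month: 8, day: 1))!
let julyPhotos = fetchPhotos(from: start, to: end)
print("Found \(julyPhotos.count) photos from July 2014")
```

Everything in this query runs against the local photo library; no search terms or metadata need to leave the device, which is exactly the contrast Apple is drawing with Google’s cloud-side cataloguing.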
At another point in today’s keynote, Apple noted that its incoming Flipboard-style News app will not share users’ reading habits with third parties, or with Apple itself.