Surely I cannot be the only observer to note the irony in last week's news that Apple has bought a start-up whose artificial intelligence (AI) technology can observe people's facial expressions and infer their emotional state from them. Coupled with Apple's other recent acquisitions – such as its purchase in November of facial-movement-mapping firm Faceshift and augmented reality (AR) company Metaio – the company may already have the ability to be more of an Orwellian Big Brother than IBM ever was in 1984.
Apple’s iconic “1984” Super Bowl advert successfully launched the Macintosh on 22 January that year by driving a sledgehammer – quite literally – through a virtual image of Big Blue. Today, 32 years on, Apple is clearly going toe-to-toe with the likes of Google and Facebook to become what could be the biggest brother of the 21st century, learning more about us ordinary users by mining the vast oceans of data at its disposal. And now, it wants to know how we feel.
To give credit where it is due, of these three technology giants, Apple is certainly the most privacy-conscious. According to the Financial Times, which reported the deal last week, Apple’s acquisition of Emotient, a small 50-employee firm from San Diego, gives it access to technology that can “learn” the correlation between facial expressions and human feelings by analysing crowd-sourced photographs and other information – all this, apparently, without having to retain recognisable images of individuals’ faces.
Emotient’s algorithm appears to rest on human intelligence used to teach machines to teach themselves about that highly elusive data set: human emotions. According to news reports, Emotient would “harvest” photos of human faces using crowdsourcing techniques, have experts label the perceived emotional states of the subjects in up to 100,000 images a day, and feed that data into its systems. With enough accurately matched pairings of emotions and faces, the technology could learn the nuances of the subject matter in question.
At a time when technology industry luminaries such as Mark Zuckerberg are trying to build AI personal assistants, it is fitting that Emotient set out to build technology capable of reading a person’s emotions and responding accordingly.
Unlike Facebook and Google (see Convenience will push the genie of privacy back into the bottle), Apple may remain adamantly opposed to storing customer data of any sort. It may equally refuse to allow itself or anyone else -- the US National Security Agency included -- to access private information transmitted on its platforms. It does, however, know its customers (for basic customer service purposes), know where they are (Find My Friends), know what they like and buy (iTunes, App Store, Apple Pay and Apple Store), and can now watch their movements (Faceshift).
One hopes that the combination of such technologies would be put to good use.
Imagine Genius Bar staff forewarned by the new AI program about the emotional state of an irate customer in line for technical support, and thus able to preemptively defuse a potentially problematic situation.
Or a highly-perceptive Siri capable of reading your emotions (with your prior permission) using your iPhone’s camera and becoming that much more of a personal assistant to you.
Or a self-driving Apple Car able to judge your ability to drive safely based on its analysis of your emotional state as evidenced by your facial expressions. Road-rage-related incidents might diminish.
It would be widely regarded as a betrayal of consumer loyalty and trust if -- in the world of the internet of things -- Apple were to consider pushing emotionally targeted products at you on the newfangled heads-up display of that same Apple Car while you were, for want of a better phrase, a captive audience.
Here’s hoping Apple stays true to its stated goal of protecting user privacy while continuing to provide an exceptional user experience with its technology.