Why Apple, Facebook, and Google Want to Read Your Emotions

Dominick Reuter/AFP/Getty Images

Artificial intelligence isn’t all killer robots and job-eating machines; it’s also behind much of the software that will make your smartphone more useful and indispensable than ever. (Just think of chatbots, which are shaping up to be the killer app of 2016 and in many cases rely on artificial intelligence to parse your inquiries and complete your requests.) Another way AI is coming front and center on the smartphone? Tech giants including Apple, Google, and Facebook are readying artificial intelligence systems that can read your emotions and engage you on a more personal level.

As Epsilon’s Steve Harries reports for VentureBeat, artificial intelligence that can respond to moods, gestures, natural language, and other complex human behaviors could solve many of the problems that both users and brands have with smartphones and smartphone software, including content discovery, attention, and ease of use. And as Harries reports, Google, Apple, and Facebook are leveraging AI “in different ways to gain a competitive advantage in an economy where attention is scarce.” Here’s how the three tech giants are developing artificial intelligence-based solutions to improve the experience you have with their products.


Apple

Apple’s goal is to offer the best user experience with the combination of its hardware and software. Harries reports that because Apple’s future depends on the success of its mobile products, the company is focused on moving the user experience “beyond the limitations of interacting with a small touch screen interface.” So Apple’s recent acquisition of Emotient, a California-based company that interprets emotions from facial expressions, makes perfect sense.

Emotion-based artificial intelligence could introduce new interaction models into the user experience you’re used to on your iPhone or iPad. And it would equip Siri, Apple’s digital assistant, with a visual processing system to better detect and predict your needs. “Using Emotient’s technology,” Harries projects, “Siri will soon be able to read your facial expressions via the front-facing camera, interpret them, and assist you with whatever you need without you even having to say ‘Hey, Siri.’ This is a game changer.”
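The interaction model Harries describes (a camera frame comes in, the assistant reacts, no wake word required) can be sketched in a few lines of Python. This is purely illustrative: `classify_expression` is a stand-in for whatever facial-analysis model a system like Emotient's would supply, and the emotion-to-action table is invented for the example.

```python
# Hypothetical sketch of an emotion-driven assistant loop.
# classify_expression() stands in for a real facial-expression model
# (the kind of analysis Emotient performs); here it is a stub so the
# control flow can be shown end to end.

EMOTION_ACTIONS = {
    "frustrated": "offer help with the current app",
    "confused": "surface a tutorial or tip",
    "happy": "stay out of the way",
}

def classify_expression(frame):
    """Stub: a real system would run a vision model on the camera frame."""
    return "confused"

def assistant_step(frame):
    emotion = classify_expression(frame)
    # React proactively: no "Hey, Siri" wake word needed.
    return EMOTION_ACTIONS.get(emotion, "stay out of the way")

print(assistant_step(frame=None))  # prints: surface a tutorial or tip
```

The point of the sketch is the control flow, not the vision model: the assistant watches, classifies, and acts, rather than waiting to be addressed.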


Facebook

Facebook’s vision is to make the world more open and more connected. To accomplish that, the social network needs to enable users to make meaningful connections, ones with the power to cut through the information overload that bombards users from all sides. So Facebook wants to use artificial intelligence to improve the content of its News Feed and advance the capabilities of its Messenger platform.

Facebook is using machine learning to improve the predictive accuracy of the algorithm that populates your News Feed, so that it can deliver contextually relevant content (and advertising). It’s also developing new capabilities for Messenger, which will be able to facilitate actions and connections via the new M virtual assistant. And Harries notes that Facebook’s Mark Zuckerberg recently detailed a new personal challenge of building a Jarvis-like artificial intelligence system that can run his home and accomplish complex tasks, like recognizing his friends and letting them in the front door. The same kind of artificial intelligence could appear in M, with a focus on enabling you to complete tasks and discover content.


Google

Because Google’s goal is to organize the world’s information and make it accessible to people, Harries notes that the search engine giant’s success depends on how easy it is for users to discover the information they need when they need it. Already, Google Now recommends content based on your data, Gmail’s predictive responses enable you to answer emails with a single tap, and Google Maps can predict your destination and offer real-time traffic information related to the routes you usually take.

Google wants to grow more adept at anticipating your needs. “Imagine if it could use machine learning and the connective tissue of the Internet of Things to deliver contextual solutions before a user even has to think about what he or she needs,” Harries writes. Google Now could reorder groceries when it senses that items are getting low or old, or book a reservation at your favorite restaurant a month in advance for your anniversary. “The future of search, according to Google, is going to be about delivering contextually relevant solutions to users automatically.”
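The grocery scenario boils down to a simple contextual trigger: sensed state crosses a threshold, and the assistant acts before the user asks. A toy sketch in Python shows the idea; every name, sensor reading, and threshold here is hypothetical, not any real Google API.

```python
# Hypothetical sketch of a context-triggered assistant: when a sensor
# reports a staple running low, the assistant queues a reorder before
# the user has to think about it. Values are illustrative only.

LOW_THRESHOLD = 0.2  # fraction of a full container remaining

def suggest_reorders(pantry_levels):
    """Return the items whose sensed level has fallen below the threshold."""
    return [item for item, level in pantry_levels.items()
            if level < LOW_THRESHOLD]

# Simulated sensor readings from a connected kitchen.
pantry = {"milk": 0.1, "eggs": 0.8, "coffee": 0.15}
print(suggest_reorders(pantry))  # prints: ['milk', 'coffee']
```

A real system would replace the static dictionary with live Internet of Things sensor data and learn per-user thresholds, but the trigger-then-act pattern is the same.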

In each case, Harries points out, the technology industry’s “big three” are looking to integrate artificial intelligence in ways that enable their products to sense what’s right for individual users. Providing contextually relevant content and actions isn’t exactly a new idea, but using facial expressions to interpret emotions is a novel approach — one that you can expect to see in action in products from the top tech companies.
