Can’t Understand Other People’s Feelings? There’s an App for That

Google Glass

An app developer called Emotient has created an app for Google’s (NASDAQ:GOOG) wearable smart-glasses, Google Glass, which can determine a Glass wearer’s emotions by analyzing minute details of facial expressions. Now we can know how ‘Glassholes’ really feel when being asked to vacate a movie theater or given a ticket for driving while wearing the device.

The company’s end goal is to use the software to collect data for retailers about how consumers react to a shopping experience in a brick-and-mortar store. Emotient raised $6 million in funding to develop its “Sentiment Analysis” app that reads Glass wearers’ facial expressions through the device’s camera.

“Emotient’s Sentiment Analysis Glassware demonstrates our goal to emotion-enable any manner of device and build the next layer of automatic sensors,” said Ken Denman, Emotient’s CEO, in a press release about the Glass app. “It’s a breakthrough technology that allows companies to aggregate customer sentiment by processing facial expressions anonymously. We believe there is broad applicability for this service to improve the customer experience, particularly in retail.”

It doesn’t take a particularly creative or tech-averse individual to imagine all the ways technology like this could go horribly wrong. If you’re like me and look perpetually sour by default, Glass could put a “Warning: Do Not Approach” sign above your head, or tell every store you walk into that you hated your shopping experience. Perhaps having a smartphone attached to our faces will make the general populace so bad at communicating that we’ll need Glass to tell us how any given person we’re speaking with is feeling.

The company says the goal of the software is to collect data in aggregate, which could help a store owner determine if a particular area of their store is more confusing, repulsive, or pleasing than others and make changes based on that information. “The Emotient software processes facial expressions and provides an aggregate emotional read-out, measuring overall sentiment (positive, negative or neutral); primary emotions (joy, surprise, sadness, fear, disgust, contempt, and anger); and advanced emotions (frustration and confusion),” the company said in a statement.

Emotient says it won’t “store video or images,” but post-NSA it seems pretty silly to take that at face value. Get it? Anyway, Emotient thinks the software will be useful to retailers in determining “aggregate consumer sentiment,” though I thought we’d already determined that everyone would rather shop online than go to the store.

Perhaps it would be more useful for bumbling boyfriends and clueless Glassholes everywhere if the software could be put to use on other people’s faces. At least Google’s robotics efforts are no longer the only projects cueing up visions of a dystopian future where the world is run by the tech behemoth.

Follow Jacqueline on Twitter @Jacqui_WSCS