Technologies that can understand human emotions, and respond appropriately, may change the ways we use technology forever.
Collectively, we hate automated phone systems. You hate 'em; I hate 'em; we all hate 'em.
While we might instinctively recoil from the idea of technology responding to us individually as "creepy," it is exactly these kinds of scenarios where we may appreciate a more "personal" touch.
If Spike Jonze’s Her seemed like a far-flung future, the reality is that today designers are considering the different ways sensors are coming online to probe deeper into our mental states with ever-increasing sophistication and capacity for learning over time.
To what end, and to what extent, these technologies should react to our emotional states and deliver tailored content is a grey area loaded with both positive and nebulous implications: just ask Facebook how the public received its recent foray into manipulating news feed content to gauge users' emotional responses.
In asking and answering these questions, we may just discover that systems with built-in emotional responsiveness have much more to offer than a personalized phone call with a robot.
In a recent article for The Atlantic, Carla Diana, a fellow at Smart Design and founder of the Smart Interaction Lab, elaborates on how technology can become more attuned to its users.
“Smart objects are producing a connection continually enriched by new information, updated apps, and the object’s ability to respond in a sophisticated manner,” she wrote. “What if it could anticipate your mood and do something slightly differently to pick you up when you’re down or help you celebrate something good?”
In line with Carla’s thoughts, a company out of the United Kingdom called Emoshape Ltd launched a successful campaign on the crowdfunding site Indiegogo. Their product, the EmoSPARK, is an artificial intelligence console built around an EPU (Emotional Processing Unit) microchip which lets it create an Emotional Profile Graph for any person with whom it communicates.
As it becomes more familiar with your personality and your interests by tapping into online social networks and media services, EmoSPARK will make recommendations for content that could have a positive effect on your mood, like a clip from YouTube or a photo you were tagged in on Facebook.
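The idea of matching content to mood can be sketched in a few lines. Everything here is hypothetical for illustration (the catalog, the mood labels, the "lift" scores); it is not EmoSPARK's actual EPU logic, just the shape of the recommendation step described above.

```python
# Hypothetical catalog: each item carries invented "lift" scores
# estimating how much it might improve a given mood.
CATALOG = [
    {"title": "funny cat clip", "source": "YouTube",
     "lift": {"sad": 0.8, "bored": 0.6}},
    {"title": "tagged beach photo", "source": "Facebook",
     "lift": {"sad": 0.7, "nostalgic": 0.9}},
]

def recommend(mood, catalog=CATALOG):
    """Return the item expected to improve the given mood the most,
    or None if nothing in the catalog targets that mood."""
    best = max(catalog, key=lambda item: item["lift"].get(mood, 0.0))
    return best if best["lift"].get(mood, 0.0) > 0 else None
```

A real system would learn those scores over time from your reactions rather than hard-code them; the point is only that "pick the item with the highest expected mood lift" is a simple ranking problem.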
The EmoSPARK runs on Android, and users can interact with it via tablets, computers and TVs, or anywhere they've got their smartphones.
If you’re not quite ready for a full-on computer companion, there are still plenty of ways that emotionally responsive technology can help. A collaboration between the Ecole Polytechnique Federale de Lausanne, a Swiss federal institute of technology, and the French car manufacturer PSA Peugeot Citroen aims to address dangerous driving by teaching cars to recognize signs of stress, fatigue and road rage through an infrared camera mounted on the dashboard.
When we see other people looking angry, we know to adjust the way we interact with them. Now imagine if cars had similar identifiers that let other drivers know to keep their distance while, inside, perhaps a subtle change in lighting could … lighten the mood?
Though the concept only identifies emotions at this stage, your next pit stop for coffee may one day come by way of a recommendation from your car.
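The decision layer that would sit on top of such a detector is straightforward to sketch. This is purely illustrative: the state labels, responses, and confidence threshold are assumptions, not the EPFL/PSA system's design.

```python
# Invented mapping from a detected driver state to an in-car response.
RESPONSES = {
    "stress": "dim cabin lighting and suggest a coffee stop",
    "fatigue": "suggest a rest break",
    "road_rage": "soften lighting and signal nearby cars",
}

def respond_to_driver(state, confidence, threshold=0.8):
    """Act only when the detector is confident; otherwise (and for
    neutral states) do nothing, since false alarms would annoy drivers."""
    if confidence < threshold:
        return "no action"
    return RESPONSES.get(state, "no action")
```

The confidence gate matters in practice: a car that dims the lights every time the camera mistakes a squint for rage would quickly lose its driver's trust.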
On that note, Synqera, a Russian company, has developed a system that responds to grocery shoppers’ emotions, offering them discounts if they show signs of having had a positive in-store experience. The platform uses touchscreen, tablet-like devices placed at checkout counters and kiosks; as shoppers go to pay, facial recognition technology automatically matches them against the store’s loyalty scheme database to pull up their shopper profile.
By pulling in purchasing histories, the system can tailor offers to both a shopper’s past behavior at the store and whatever emotion they are experiencing in the moment. Though the company assures us that all data is gathered anonymously, privacy concerns naturally remain.
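The checkout flow just described reduces to a lookup plus a gate. A minimal sketch, with the profile fields, emotion labels, and discount rule all invented for illustration (not Synqera's actual logic):

```python
# Hypothetical anonymized loyalty database, keyed by a face hash
# rather than a name to echo the "gathered anonymously" claim.
LOYALTY_DB = {
    "face_hash_123": {"visits": 42, "avg_basket": 31.50},
}

def checkout_offer(face_hash, detected_emotion):
    """Return a discount offer only for recognized shoppers showing
    a positive emotion; otherwise return nothing."""
    profile = LOYALTY_DB.get(face_hash)
    if profile is None or detected_emotion not in ("happy", "content"):
        return None
    # Illustrative rule: frequent shoppers get a slightly deeper discount.
    rate = 0.10 if profile["visits"] > 20 else 0.05
    return {"discount": rate, "basis": profile["avg_basket"]}
```

Note that even this toy version makes the privacy trade-off concrete: the offer logic only works because a biometric key unlocks a purchase history.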
Automation can be frustrating, sure, but it isn’t always bad. Consider two technologies used by pretty much every big bank: automated teller machines and those automated phone systems from before. Both of these customer-facing systems are designed to help us help ourselves, as long as we know what we need. And there’s the rub. With ATMs, we have a clear goal: need money, get money. With calls, though, it could be anything from a lost card to credit fraud to questions about a second mortgage. Who knows? You do, but the system does not, and conveying your needs to a robot (no matter how friendly its voice) can be a hassle.
A team of researchers at Universidad Carlos III de Madrid and Universidad de Granada, two Spanish universities, built a computer that can analyze 60 different sound patterns in a human’s voice and suss out the warning signs of negative emotions like anger or boredom. It can also reference stored data from earlier customer interactions and statistically determine what actions a caller might take. By combining those two insights, the system can quickly assess whether it can handle your issue, or if it would be better to get you to a live agent and transfer you right away.
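The routing decision combines the two signals the researchers describe: an acoustic emotion score and a prediction from past interactions. A toy sketch of that combination, where the feature names, weights, and threshold are all assumptions (the real system analyzes 60 acoustic patterns, not two):

```python
def negative_emotion_score(features):
    """Stand-in for the acoustic analysis: average of normalized
    feature values, where 0 is calm and 1 is clearly angry or bored."""
    return sum(features.values()) / len(features)

def predict_escalation(history):
    """Stand-in for the statistical model over past interactions:
    the fraction of this caller's previous calls that needed an agent."""
    if not history:
        return 0.0
    return sum(1 for call in history if call == "agent") / len(history)

def route_call(features, history, threshold=0.5):
    """Blend both signals (invented weights) and transfer the caller
    to a live agent when the combined risk crosses the threshold."""
    score = (0.6 * negative_emotion_score(features)
             + 0.4 * predict_escalation(history))
    return "live_agent" if score >= threshold else "automated"
```

The design choice worth noting is that either signal alone can trigger a transfer: an audibly angry first-time caller and a calm caller with a history of escalations both get routed to a human early, before frustration compounds.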
The beauty of human interactions (and the bane of automated ones) is that, being human, we’re able to empathize with one another. We can draw on our own experiences and shape our interactions accordingly. It’s a special gift that makes living together in large societies possible, and often enjoyable.
As technology grows to take on an ever-increasing role in our lives, maybe it’s time that we start building systems which learn to empathize with us as well.