Witty, personable and able to anticipate your needs — those may not be the first words you think of when describing your smartphone. But with the help of Apple’s Siri, Google Now and Microsoft’s Cortana, our mobile devices may soon become our most trusted assistants.
Siri introduced one of the key functions of today’s mobile assistants: helping users find and access information through voice commands. However, the emergence of predictive systems like Google Now and Cortana may be ushering in a new generation of digital assistants that support and streamline daily routines in our personal and professional lives.
Gartner predicts that by the end of 2016, the digital assistant will be one of the most accessible and popular mobile applications, responsible for handling users' most complex decisions. Beyond scheduling nights out or reordering groceries, digital assistants are expected to autonomously purchase more than $2 billion in goods and services online.
So, how complex, sophisticated and independent can these apps truly be?
“If I think of ‘personal assistant’ and what it takes to make a human personal assistant great, it’s the [ability to] get inside your head, [to] anticipate what you need,” said Lama Nachman, principal engineer and manager of the Anticipatory Computing Lab at Intel Labs. “[In the past], the technology has not been able to do that.”
Today's digital assistants rely largely on automation: users can verbally ask questions, get directions or compose emails and texts. While Siri serves queries on command, Google Now attempts to anticipate requests. By analyzing activity on users' phones and personal information gathered from Google's various services, like Search, Gmail and Maps, it keeps users continuously informed of daily schedules, travel itineraries and real-time updates on their sports teams.
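The distinction between a reactive assistant and an anticipatory one can be sketched in a few lines. Everything below is invented for illustration — the signal schema, the flight event and the two-hour heuristic are assumptions, not how any of these products actually work:

```python
# Hypothetical signals an anticipatory assistant might merge. The source
# names echo the article's examples (calendar-style data, maps), but the
# schema and values are made up for illustration.
SIGNALS = {
    "calendar": [{"event": "Flight to SFO", "hour": 9}],
    "maps":     [{"place": "Airport", "commute_minutes": 45}],
}

def reactive_assistant(command: str) -> str:
    """Siri-style: does nothing until explicitly asked."""
    if command == "when is my flight":
        return "Your flight is at 9:00."
    return "Sorry, I didn't catch that."

def anticipatory_assistant(signals: dict, current_hour: int) -> list:
    """Google Now-style: cross-references signals and volunteers suggestions."""
    cards = []
    commute = signals["maps"][0]["commute_minutes"]
    for event in signals.get("calendar", []):
        # Toy heuristic: surface a card once the event is within two hours.
        if event["hour"] - current_hour <= 2:
            cards.append(f"Leave soon: {event['event']} ({commute} min drive)")
    return cards
```

The reactive function returns nothing useful unprompted; the anticipatory one produces a card at 8 a.m. for a 9 a.m. flight without being asked — which is the shift the article describes.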
In April, Microsoft unveiled Cortana, a Windows Phone assistant that blends the more human traits of Siri with the anticipatory functionality of Google Now, and one that was, in fact, modeled on real-life personal assistants.
“We did some research and found that people are more likely to interact with [AI] when it feels more human,” Susan Hendrich, principal program manager lead for Windows Phone, told Engadget. Hendrich went on to describe Cortana as “eager to learn” and “downright funny, peppering her answers with banter or a comeback.”
To understand its user, Cortana maintains a virtual notebook of approved personal information: users can grant or deny access to email, and can share personal interests like favorite foods, restaurants or movies to help the system surface more relevant suggestions.
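A notebook like this amounts to a permission-gated preference store. The class below is a toy sketch of that idea; the categories, method names and example data are all assumptions made for illustration, not Cortana's actual design:

```python
class Notebook:
    """Toy model of an assistant's notebook: signals are gated by explicit grants."""

    def __init__(self):
        self._grants = {}     # category -> bool (user's grant/deny decision)
        self._entries = {}    # category -> list of items

    def set_access(self, category: str, allowed: bool) -> None:
        self._grants[category] = allowed

    def add_entry(self, category: str, item: str) -> None:
        self._entries.setdefault(category, []).append(item)

    def visible(self, category: str) -> list:
        """The assistant sees a category only if the user has granted it."""
        if self._grants.get(category, False):
            return self._entries.get(category, [])
        return []

nb = Notebook()
nb.set_access("food", True)
nb.set_access("email", False)          # user denies email access
nb.add_entry("food", "ramen")
nb.add_entry("email", "boss@example.com")
```

Note that denied or never-granted categories return nothing even when data exists — the deny decision, not the data's presence, controls what the assistant can use.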
With the advent of these new predictive systems, Nachman said there will eventually be a shift from reactive or responsive programs, like Siri, to systems that can fully anticipate our goals and inclinations. But since many of these technologies learn from user data over time, it's often difficult to make them as proactive as they could be until they've collected enough information to analyze. The process, she admitted, is an evolutionary one.
“There’s tons of information and being able to decipher what you care about is a bit hard,” Nachman explained. “So how do we string different things together and help you view all the clutter? [The digital assistant] needs to comprehend my routine and find that relevance. But there is more likelihood or chance to make a mistake. It’s that trade-off that we continue to work with from a technology standpoint.”
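The trade-off Nachman describes can be framed as a relevance threshold. In this sketch (candidate notifications and scores are invented for illustration), lowering the threshold makes the assistant more proactive but more likely to surface irrelevant clutter; raising it makes the assistant quieter but safer:

```python
def surface(candidates: list, threshold: float) -> list:
    """Return only the notifications whose estimated relevance clears the bar."""
    return [text for text, score in candidates if score >= threshold]

# Hypothetical (text, relevance score) pairs an assistant might consider.
candidates = [
    ("Traffic is heavy on your commute", 0.9),
    ("Your team plays tonight", 0.7),
    ("A store you passed once has a sale", 0.2),
]
```

At a threshold of 0.8 only the commute alert appears; at 0.5 the sports update also makes the cut; at 0.1 everything — clutter included — gets through. Tuning that bar is the mistake-versus-helpfulness balance Nachman mentions.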
In 2013’s “Her,” Spike Jonze envisions a time when machine-to-human interaction is as emotionally complex and intimate as relationships between real people. Margie Morris, a clinical psychologist and senior researcher at Intel Labs, noted that although these ambitious systems may not be perfect, forming a bond with them will become a part of the experience.
“We might start thinking of the technology as an advisor or peer rather than as an assistant,” Morris said. “Instead of giving us exactly what we ask for, the technology could challenge us a little, by interpreting our requests or redirecting us to choices that are more aligned with our values. So if I ask for the nearest Krispy Kreme, my phone might show me that along with a nearby place for fresh-pressed juice (presuming it knows I like both). In this way, tech could help us align our momentary choices with our long-term intentions.”
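Morris's doughnut example is essentially a re-ranking behavior: answer the literal query, then append alternatives that match the user's stated long-term preferences. The function and data below are a hypothetical sketch of that idea, not any shipping assistant's logic:

```python
def suggest(query: str, nearby: dict, preferences: list) -> list:
    """Answer the literal query, then add preference-aligned alternatives."""
    results = [nearby[query]] if query in nearby else []
    for pref in preferences:
        if pref != query and pref in nearby:
            results.append(nearby[pref])   # a gentle nudge, not a refusal
    return results

# Invented places and a user preference list, for illustration only.
nearby = {"doughnuts": "Krispy Kreme, 0.3 mi", "juice": "Press Co., 0.4 mi"}
```

Asking for doughnuts returns the Krispy Kreme result first, with the juice bar appended — the assistant still honors the request, but surfaces the value-aligned option alongside it.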
As Nachman and Morris look to the future of digital assistants, they see an incredible amount of potential in the health and wellness space, especially as data streams from a mobile device’s camera, audio and physiology inputs begin to merge.
Morris imagines a day when our phones interact with all the devices we wear and make recommendations about which friend to see by analyzing what we type, how we’re sitting or how much protein we’ve eaten that day.
“I think we’re at a really interesting point right now because there’s all of this data that’s coming in and people are being sensed in so many different ways, so we’re seeing a lot of access to physiology,” Nachman added. “I can start to figure out why the problems I’m running into are happening. And as you start to gather that, this type of assistance from a preventive and healthcare standpoint becomes much more attainable.”
As these systems come to understand your emotions and, eventually, you as a whole, their assistance will be based solely on what they know about you. It doesn't get much more personal than that.