The concept of “getting to know someone” is the embodiment of a larger, more abstract process. Metaphorically, it’s peeling back the layers to find out who a person really is and then deciding how closely you’d like to be associated with them. This increased level of intimacy is not without its own set of emotional challenges, yet despite them, we yearn for these deeper bonds, knowing how enriching and essential they are to our health and happiness.
We change careers, move to new cities, get married, grow older and bear witness to our various life stages pushing and pulling at the bonds we’ve built over time, all while further complicating things by adding more relationships to the mix. And in each one, we pick and choose what information we share and with whom based on our own complicated set of rules.
It’s within these complex relationship structures that designers are considering how technology can play a supporting role. Some are proposing solutions that make it much easier to share the personal and sometimes intimate details of our lives with others, thereby strengthening our existing relationships and helping us form new ones along the way.
In a trend we’re calling “Shared Awareness,” we notice that wearable and mobile devices are automatically capturing and broadcasting contextually relevant information at key moments to enable a seamless flow of communication between people.
These systems continually monitor individual data like location and activity level, delivering a preprogrammed set of notifications to a trusted peer group and activating a network around timely support and care.
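A system like this can be sketched as a small rule engine: each sensor reading is checked against preprogrammed conditions, and any matching rule produces a notification for the trusted peer group. A minimal illustration — the rule names, thresholds, and messages here are hypothetical, not from any shipping product:

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class Reading:
    kind: str       # e.g. "steps" or "location"
    value: object   # the sensed value

# Hypothetical preprogrammed rules: a condition paired with the
# message broadcast to the trusted peer group when it matches.
RULES: list[tuple[Callable[[Reading], bool], str]] = [
    (lambda r: r.kind == "steps" and r.value < 500,
     "Unusually low activity today - consider checking in."),
    (lambda r: r.kind == "location" and r.value == "home_all_day",
     "Hasn't left home today."),
]

def notifications(reading: Reading) -> list[str]:
    """Return the messages this reading should send to the peer group."""
    return [msg for cond, msg in RULES if cond(reading)]

print(notifications(Reading("steps", 120)))
```

The point of the sketch is that the "preprogrammed set of notifications" is just data: the trusted circle can add, remove, or tune rules without changing the monitoring code.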
Intel Labs research scientist Margaret Morris explores how technology can better connect us and ensure richer, fuller relationships between loved ones. Along with the form these devices will take, she talked to us about the implications of mining previously unavailable data sets to uncover insights into ourselves and entire demographics at large.
What kind of interesting data could be communicated through sensor technologies over distance?
So much is possible here. For example, data from sensors embedded in the environment can reflect the rhythms of daily life: routines and deviations from those routines that may indicate if someone is ill or lonely. Through physiological monitoring, e.g., of heart rate or electrodermal activity, via sensors in wearables or clothing, we can get a sense of another person’s emotional and physical states.
From connected devices such as Nest, we can infer whether another person is feeling too hot or cold, and whether they are asleep or away. Ingested and implanted sensors will allow people to share more in-depth physiological information. This kind of data is relevant not just for checking in on one person, e.g., monitoring the sleep of an infant or the activity of an elderly parent, but for understanding population patterns. Sensor data will refine the way we categorize and treat many types of health conditions.
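One way to operationalize the "deviations from routine" Morris mentions is to compare today's sensor reading against a rolling baseline and flag values far outside the usual range as worth a check-in. A minimal sketch, with illustrative data and a conventional z-score threshold (the specifics are my assumption, not a described system):

```python
import statistics

def is_deviation(history: list[float], today: float,
                 z_threshold: float = 2.0) -> bool:
    """Flag today's value if it sits more than z_threshold standard
    deviations from the mean of the recent history."""
    mean = statistics.mean(history)
    sd = statistics.pstdev(history)
    if sd == 0:
        return today != mean
    return abs(today - mean) / sd > z_threshold

# Two weeks of daily step counts, then a sharply lower day.
baseline = [6200, 5800, 6400, 6100, 5900, 6300, 6000,
            6150, 5950, 6250, 6050, 6350, 5850, 6200]
print(is_deviation(baseline, 900))    # a drop this large breaks the routine
print(is_deviation(baseline, 6100))   # an ordinary day does not
```

A real system would need per-person baselines and more robust statistics, but the shape of the inference — routine, deviation, gentle alert — is the same.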
Is there an opportunity to take advantage of the real‑time nature of these tools?
Definitely. The loop of monitoring and feedback has to be closed. Connectivity among devices allows us to experiment with more complex and interesting loops than in the past. The feedback I receive from the sensing of my mood, for example, might come in the form of ambient lighting. Feedback pertaining to the sensing of emotion and interpersonal dynamics has to come in ways that make sense socially.
Currently most feedback appears on the phone, but that will expand to clothing or environmental nudges. Imagine that a system like Tinder makes a connection between two people at a party or a cafe. Perhaps something is projected on a wall in front of them to stimulate conversation. Artistic experimentation is needed, along with technical innovation and social science, to figure out what will work and what will be appealing.
Are there any tools out there that will specifically aid how we connect with others?
People are already sharing their data to enhance relationships. For example, with apps such as Waze, many share their location to let someone else know where they are and when they will arrive.
This kind of sharing provides reassurance and empathy for the person in transit. In addition, peer-to-peer knowledge sharing in these apps about police officers, traffic jams, etc., allows people to bond with fellow drivers. There is a lot of opportunity to help people share data about their activity, physiology and environments in ways that enhance intimacy rather than just broadcasting.
What’s important about the actual form these sensors could take? Could they become more embedded? Will more devices become involved?
Sensors are certainly becoming embedded in our environments, our devices and our bodies. Connected devices such as Nest capture interesting data about our patterns and thermal preferences.
The phone is a powerful intermediary, sensing both the ambient conditions and our physiology and behaviors. We may soon see direct control of environmental conditions, such as lighting and temperature, based on sensing of internal states.
Sensing of the body is increasingly embedded, with powerful advances in wearables, such as contact lenses that sense early indications of diabetes, along with electronic tattoos and ingestible pills that sense temperature or transmit images of the digestive system.
What other types of analysis (and feedback) could be offered?
An enormous amount of linguistic data is generated from text messaging, emailing and social media. This data can tell us about how different types of people communicate and how relationships evolve.
My colleagues and I have been exploring how to translate this data into real-time feedback, to help people enhance their relationships and experiment with new styles of self-expression.
In the Verbalucce project, feedback about how “in synch” you are with someone else comes up as you are composing an email. It invites you to think about goals for a particular relationship and to edit accordingly.
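One established way to quantify how "in synch" two writers are is linguistic style matching: comparing the rates at which each uses function words (articles, pronouns, prepositions). The toy version below is my own illustration of that general idea, not the Verbalucce implementation:

```python
import re

# A small sample of English function words; real analyses use
# standard lists of a few hundred.
FUNCTION_WORDS = {"the", "a", "an", "i", "you", "we", "and", "but",
                  "of", "in", "to", "is", "are", "that", "it"}

def style_rate(text: str) -> float:
    """Fraction of words that are function words."""
    words = re.findall(r"[a-z']+", text.lower())
    if not words:
        return 0.0
    return sum(w in FUNCTION_WORDS for w in words) / len(words)

def in_synch(a: str, b: str) -> float:
    """Similarity in [0, 1]: 1.0 means identical function-word rates."""
    ra, rb = style_rate(a), style_rate(b)
    return 1.0 - abs(ra - rb) / max(ra + rb, 1e-9)

print(in_synch("I think we should meet soon.",
               "Yes, I think that would be great."))
```

A score like this, recomputed as you type, is the kind of signal that could drive the compose-time feedback the interview describes.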
How does this play out?
Over time, our computing systems will learn how conversations happen across different types of relationships and across cultures. They will be able to offer guidance that helps people communicate more effectively. In the vein of spell check, our computers may give us relational assistance. Let’s say you are writing to someone very different from you in culture and age. As you are writing, deleting and rewriting, you might see a suggestion for an appropriate way to say ‘I’d love to see you again.’