
Trust and Autonomous Driving: What Will It Take?

by Deb Miller Landau
iQ Managing Editor

Reactions from people taking their first ride in an autonomous car help researchers design human-machine interfaces that instill trust in driverless vehicles.

What will it take for people to trust riding in a car that no one is driving?

Beyond the technological and regulatory challenges of developing autonomous cars, the question of how to establish and maintain trust between humans and machines keeps some industry experts up at night.

One area of design addressing this question is the human-machine interface (HMI), which focuses on the way humans interact with machines. The HMI between people and cars is essential to paving the road toward fully autonomous vehicles.

“The idea of autonomous vehicles is as much a human and social challenge as it is a technological challenge,” said Matt Yurdana, a user experience creative director working for Intel’s Automated Driving Solutions group.

Yurdana and his team are working to identify “trust interactions” — key exchanges that need to be built and designed into autonomous vehicles.

Earlier this year, Yurdana and Jack Weast, the chief systems architect of Intel’s Autonomous Driving Group, invited a mix of community members to test drive a driverless car on a closed circuit at Intel’s Advanced Vehicle Lab in Chandler, Ariz.

In this HMI test drive, participants experienced a series of five trust interactions, including requesting a vehicle, starting a trip, making changes to the trip, handling errors and emergencies, and pulling over and exiting.

Safety was the primary concern for most participants, but they also expressed confidence that autonomous cars will be safer because they remove human error.

“I drive a lot so it’s still very hard to just get comfortable in the back seat,” said one tester. Another asked how disabled people will get assistance if there’s no one driving the car. Another tester wondered how an autonomous car would make intelligent decisions, like whether or not to drive in icy conditions.

Understanding how the technology functioned, and the full extent of its capabilities, was paramount for participants. Experiencing how the vehicle could communicate, respond to road hazards and simply handle the drive helped boost people’s confidence.

There’s a learning curve to becoming accustomed to riding in a car without an actual driver.

“Many people came in nervous and apprehensive,” said Weast, adding that by the end, “every single participant experienced a huge leap in their confidence.”

A recent AAA study found that 78 percent of Americans are afraid to ride in self-driving cars.

“We might be able to build the perfect car from a technology standpoint — and it could drive perfectly and keep you safe all of the time,” said Weast. “But if we don’t feel psychologically safe, then we’re not going to use the service or buy one of these cars.”

Humanizing Tech

Psychologists, sociologists and relationship gurus study how to build foundational trust between people, whether colleagues, lovers or friends. Yurdana and his team are examining how to build that same trust between people and autonomous cars.

“Trust equals safety,” said Yurdana. “It equals confidence and comfort that’s not only physical but psychological.”

Experts know that anthropomorphism — the attribution of human traits, emotions and intentions to non-human entities — helps people relate more to technology.

In a study published in the Journal of Experimental Social Psychology, Nicholas Epley, professor of behavioral science at the University of Chicago’s Booth School of Business, and his team of researchers found that by giving a driverless car human-like qualities, people were more able to trust the car.

The researchers divided participants into three groups. In one, participants drove a regular car, another group rode in a technically sound autonomous vehicle, and the third group rode in the same autonomous vehicle enhanced with anthropomorphic characteristics, such as a name, gender and soothing voice.

Communication between the autonomous car and passenger is important in building trust.

In all three scenarios, an unavoidable accident was forced upon the participants.

Overall, people trusted the anthropomorphized vehicle more than the other two kinds of cars. They liked the ride better. And here’s the kicker: participants blamed the anthropomorphized vehicle less than the non-anthropomorphized autonomous vehicle for its role in the accident.

“The very notion of attributing a human-like mind to an automated vehicle made all the difference in the world as to whether participants were willing and able to trust the vehicle,” said Yurdana.

Unlike interactions with Siri or Alexa, where the conversation is one-sided (a person asks a question and the assistant answers), the interaction with an autonomous vehicle is much more intricate. The vehicle needs not only to respond, but also to initiate and contextualize conversations.

For example, if someone is reading a book while the autonomous car handles the evening commute and the car encounters a detour, it might ask the passenger which route to take, weighing variables from traffic conditions to the rider’s preferences. Think of directing a taxi driver, only without the driver, said Yurdana.
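As a rough illustration of that kind of car-initiated exchange, here is a minimal sketch of the interaction pattern: the vehicle weighs the delay of each alternative route against the rider’s stated preference, proposes one, and opens the conversation rather than waiting for a command. Every name and number in it (RouteOption, propose_detour, the five-minute preference nudge) is a hypothetical placeholder, not part of Intel’s actual HMI work.

# Hypothetical sketch of a car-initiated route decision, in the spirit of the
# detour example above. Names and scoring are illustrative assumptions only.
from dataclasses import dataclass

@dataclass
class RouteOption:
    name: str
    extra_minutes: int        # delay versus the original route
    matches_preference: bool  # e.g., the rider prefers surface streets

def propose_detour(options: list[RouteOption]) -> RouteOption:
    """Rank alternatives by delay, nudged by the rider's stated preference."""
    return min(
        options,
        key=lambda o: o.extra_minutes - (5 if o.matches_preference else 0),
    )

def ask_passenger(best: RouteOption, options: list[RouteOption]) -> str:
    """The vehicle initiates the conversation instead of waiting to be asked."""
    others = ", ".join(o.name for o in options if o is not best)
    return (
        f"There's a detour ahead. I suggest {best.name} "
        f"(about {best.extra_minutes} extra minutes). "
        f"Would you rather take {others}?"
    )

if __name__ == "__main__":
    options = [
        RouteOption("the highway", extra_minutes=8, matches_preference=False),
        RouteOption("the surface streets", extra_minutes=11, matches_preference=True),
    ]
    print(ask_passenger(propose_detour(options), options))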

Autonomous cars use cameras to see road hazards and read signs.

Even with a soothing voice and some reliability established, these interactions are about humans trusting technology, not another human being. People trust strangers all the time, from on-demand car-service drivers to airline pilots to bus drivers, but that is still people trusting other people. Autonomous cars ask people to build that same foundational trust with technology.

For some, this is a scary proposition — everyone has dealt with glitchy tech at some point in their lives.

“Trust doesn’t come about because people understand all of the technical paths and permutations that get there,” said Yurdana. “Their trust comes from those interactions.”

For example, Mary can tell Kim that Steve’s a great guy, but until Kim knows that for herself, she won’t fully believe it. Yurdana said it’s through interactions that people build genuine trust.

“The absence of a steering wheel or pedals is going to be quite dramatic to those of us who step into a vehicle like that for the first time,” Weast said.

The HMI test drive in Arizona was the first of its kind, and it’s just the starting line.

“That’s why creating a trust relationship between the human and the machine is so vitally important,” Weast said.

 
