New technology monitors what’s outside your vehicle, but soon it could determine what’s happening inside.
The term “self-driving cars” calls to mind the Flying Ford Anglia in Harry Potter, or the futuristic spaceship pods in “The Jetsons.” But while we are a long way from cars that actually fly through the sky, we are getting closer to cars that are smart, savvy and, at least partially, autonomous.
But a variety of safety and legal implications accompany the evolution of driver-assisted and self-driving cars, and some organizations have predicted a much longer timeline for their implementation.
Regardless, the pace of innovation is quickening. One key advancement takes the focus off of the vehicle, its systems and the roadway, and turns the technology inward to the driver.
Seeing Machines is an Australian company that has developed a camera system and accompanying software to monitor drivers. The system watches the vehicle’s operator to determine if and when that person is getting drowsy, distracted or just plain not watching the road.
This is a common problem. The Anti-Snoozer app won AT&T’s Developer Summit Hackathon earlier this year. Using Intel Edison and Intel RealSense technology, the app uses facial recognition to determine if a driver is getting drowsy.
In collaboration with Intel and Jaguar Land Rover, Seeing Machines implemented one of their driver monitoring systems into a Jaguar F-type for display at the Consumer Electronics Show in Las Vegas earlier this year.
“We’ve collaborated on a number of projects and we started talking to Intel’s automotive group…to do something really cool for CES,” said Seeing Machines vice president Nick Langdale-Smith.
They all met to brainstorm at Jaguar Land Rover’s new Portland, Oregon, research center, and six months later their prototype was ready.
The resulting convertible Jaguar has a steering-wheel-mounted camera that can “see” where the driver is looking — even if he or she is wearing sunglasses. The camera and software determine whether a person is getting drowsy by monitoring the eyelids. They can also detect the slow, progressive listing of the head to one side, which often means the driver is falling asleep.
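A common way to quantify eyelid-based drowsiness in the research literature is PERCLOS — the percentage of time the eyes are mostly closed over a sliding window. The sketch below illustrates that idea combined with a head-tilt check; it is a hypothetical simplification, not Seeing Machines’ actual algorithm, and the threshold values are assumptions.

```python
# Hypothetical drowsiness heuristic based on PERCLOS (percentage of
# eyelid closure) plus head tilt -- illustrative only, not the actual
# Seeing Machines algorithm. Thresholds are assumed values.

def perclos(eye_openness, closed_threshold=0.2):
    """Fraction of samples where eye openness (0 = closed, 1 = open)
    falls below the 'closed' threshold."""
    closed = sum(1 for o in eye_openness if o < closed_threshold)
    return closed / len(eye_openness)

def is_drowsy(eye_openness, head_tilt_deg,
              perclos_limit=0.15, tilt_limit=20.0):
    """Flag drowsiness if the eyes are closed too often over the
    window, or the head is listing past an assumed tilt limit."""
    return (perclos(eye_openness) > perclos_limit
            or abs(head_tilt_deg) > tilt_limit)

# Mostly-open eyes, upright head -> alert
print(is_drowsy([0.9, 0.8, 0.85, 0.9], head_tilt_deg=3.0))   # False
# Eyes drooping shut for most of the window -> drowsy
print(is_drowsy([0.1, 0.15, 0.9, 0.1], head_tilt_deg=3.0))   # True
```

A production system would derive eye openness and head pose from the camera’s facial-landmark tracking and smooth the signals over time, but the thresholding idea is the same.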
With driver distraction leading to over 3,000 deaths and 400,000 injuries in 2012, and 21 percent of all fatal car crashes between 2009 and 2013 involving a drowsy driver, inward-facing systems that monitor the driver are not only welcome but, with today’s technology, finally possible.
When the software is paired with the external monitoring systems already available in many newer cars, it can feed into features such as blind-spot monitoring and collision detection.
“We integrated our driver monitoring system into a touch screen,” said Langdale-Smith. “If the driver monitoring system measures that the driver is looking at it instead of the road and the system detects a traffic situation, the In-vehicle Infotainment (IVI) will display a warning message.”
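The warning logic Langdale-Smith describes — cross-referencing where the driver is looking with what the outward-facing sensors report — can be sketched roughly as follows. The gaze labels and hazard flag are hypothetical placeholder inputs; the real system’s interfaces are not public.

```python
# Rough sketch of the described IVI warning logic: warn only when the
# driver is looking away from the road AND the outward-facing sensors
# report a traffic hazard. Inputs are hypothetical placeholders.

def ivi_warning(gaze_target, hazard_detected):
    """Return a warning message for the In-Vehicle Infotainment (IVI)
    display, or None if no warning is needed."""
    if gaze_target != "road" and hazard_detected:
        return "Eyes on the road: traffic ahead"
    return None

print(ivi_warning("ivi_screen", hazard_detected=True))   # warning shown
print(ivi_warning("road", hazard_detected=True))         # None
```

The key design point is that neither signal alone triggers the alert: glancing at the screen on an empty road, or a hazard while the driver is already watching it, produces no warning, which keeps nuisance alerts down.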
The Jaguar F-type project required the skills of each of the three partner companies, according to Matt Jones, a senior technical specialist at Jaguar Land Rover.
“Intel, looking at the hardware compute platform; Jaguar Land Rover, understanding how it would best integrate with the vehicle, working on the camera integration and all those hardware aspects of bringing this into an automotive demonstration; and then Seeing Machines, bringing their algorithms and their software and the camera technology,” he said.
“And then we all worked together to make sure that all of that software and hardware was integrated.”
This Driver Monitoring System is certainly impressive, and can potentially save lives, but is it truly necessary if self-driving cars are right around the corner?
“There is a lot of hype and a lot of interest around autonomous or semi-autonomous road-going vehicles,” Langdale-Smith said. “But the timeframes to really remove the driver completely are much longer than people might believe.”
He mentioned that despite legislative progress, there are still a lot of regulatory and legal barriers that must be crossed. Perhaps most notably, accident liability.
“There are [historical] legal frameworks that place liability on the driver of the vehicles,” said Langdale-Smith. “We’re wading into very interesting territory in the next five to ten years around where exactly does that liability sit.”
Those conversations are already happening. Believe it or not, there are arguments in favor of giving robots “legal personhood.” For Seeing Machines, Intel and Jaguar Land Rover, there’s plenty of room for technology to bridge the gap until driverless cars actually become a reality.
“We think that there’s going to be a gradual growth of automated features,” said Langdale-Smith. “You don’t go from zero to 100% overnight.”
Langdale-Smith envisions a point where the car drives half the time and a person drives the other half. Perhaps when the car is on the freeway it will drive, but when the vehicle is on surface roads, a human driver is in charge.
But even then, what if the car is on the freeway and encounters a situation it doesn’t fully recognize and understand?
It may want a human to take over control, but the driver could be distracted or simply not paying attention. Those situations are where a system like Seeing Machines’ — one that watches the driver instead of the road — would be vital.
“As the level of automation increases, the driver will be less and less likely to have the situational awareness at a moment when it’s needed,” Langdale-Smith said.
“You’re going to have to be paying attention for 50% of the time, then 30% of the time, then 10% of the time, then 5% of the time… that puts a lot of demands on the driver to be alert and attentive so that they can resume control of the vehicle,” he said, noting that the less engaged or more bored a driver is, the easier it becomes to grow drowsy and fall asleep.
“Until the steering wheel disappears altogether, the driver isn’t going anywhere,” said Langdale-Smith.