The road to self-driving cars is filled with philosophical and technological potholes, and navigating them is helping engineers build reliable, safe systems.
Talk of self-driving cars started way back in 1939, when General Motors’ Futurama exhibit at the World’s Fair hinted at a new era of cars controlled by technology-embedded roadways. That early vision of autonomous cars focused less on the vehicles themselves and more squarely on the infrastructure that would support them.
Back then, driverless cars were harder to believe in, but decades of technological advancement – and partnerships across different industries – are making autonomous cars a reality. A BI Intelligence report estimates that 10 million self-driving cars will be on the road by 2020, and IHS predicts autonomous car sales will reach 21 million by 2035.
Fast-forward to today, when sensor technology, computer processor capabilities and machine learning – the ability for computers to accumulate, learn and create mathematical algorithms based on vast amounts of data – have created what Intel’s Jack Weast calls the “perfect storm” of capabilities that will allow cars to drive themselves.
“It’s taken us this long to get to the point where the fundamental science behind autonomous driving is mature enough that we could actually bring it to reality,” said Weast, chief systems engineer for autonomous driving solutions at Intel.
Switching Gears to Full Autonomy
SAE International classifies vehicle automation using six levels, from 0 (no automation) to 5 (full automation). Levels 1 and 2 still require a human to keep hands on the wheel and cover features like adaptive cruise control, responsive braking and parking assist. Level 3 puts cars on “autopilot,” but a driver must be ready to take back control when needed.
Level 4 requires even less driver involvement – a passenger could even take a nap – and Level 5 is full automation, where the car may not even have a seat or controls for a driver.
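The levels above can be sketched as a simple lookup table. The short descriptions below paraphrase this article, not the official SAE J3016 wording:

```python
# SAE driving-automation levels as described above (paraphrased,
# not official SAE J3016 definitions).
SAE_LEVELS = {
    0: "No automation: the human does all the driving.",
    1: "Driver assistance: e.g. adaptive cruise control; hands on the wheel.",
    2: "Partial automation: e.g. responsive braking, parking assist; driver supervises.",
    3: "Conditional automation: 'autopilot', but the driver must be ready to take over.",
    4: "High automation: a passenger could even take a nap.",
    5: "Full automation: possibly no seat or controls for a driver at all.",
}

def describe(level: int) -> str:
    """Return the description for an SAE level, raising on invalid input."""
    if level not in SAE_LEVELS:
        raise ValueError(f"SAE levels run from 0 to 5, got {level}")
    return SAE_LEVELS[level]
```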
The realisation of autonomous or self-driving cars requires an almost unthinkable amount of data crunching. Weast said today’s mobile phone plans allow for a couple of gigabytes of data every month, enough for streaming video and music. By comparison, an autonomous vehicle will chomp through terabytes of data every hour. That’s too much data to send to the cloud, so high-performance computing is needed in the car to process it all in real time.
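The scale gap Weast describes can be made concrete with some back-of-the-envelope arithmetic. The exact figures below are illustrative assumptions, not Intel’s numbers:

```python
# Rough comparison of data volumes (illustrative assumptions).
phone_plan_gb_per_month = 2   # a typical mobile data allowance
car_tb_per_hour = 4           # assumed terabytes generated per driving hour

car_gb_per_hour = car_tb_per_hour * 1000            # 1 TB = 1000 GB (decimal)
ratio = car_gb_per_hour / phone_plan_gb_per_month   # one driving hour vs a month of phone use

print(f"One hour of driving is about {ratio:,.0f}x a monthly phone plan")
# With these assumptions, a single hour of sensor data dwarfs a month of
# mobile usage -- which is why the processing must happen in the car.
```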
An autonomous car is loaded with sensors such as cameras, lidars and radars that uniquely sense information about the environment around the vehicle. Cameras might see a person, for example, but radar can sense depth, recognizing the difference between a real human and, say, a cardboard cutout of a person. The whole system must work in tandem.
“You need to not only have these high-fidelity sensors, but you also need high-performance central-brain intelligence inside the vehicle to be able to process that sensor data, create a virtual environment representation of the physical world that’s around the car, and then make very complex and very fast, real-time decisions about what to do in terms of steering, braking or whatever,” said Weast. “And that’s a closed loop that must exist within the vehicle.”
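The closed loop Weast describes can be sketched in a few lines of Python. Every function and class name here is a hypothetical placeholder for illustration, not a real API:

```python
from dataclasses import dataclass, field

@dataclass
class WorldModel:
    """Virtual representation of the environment around the car."""
    objects: list = field(default_factory=list)

def read_sensors() -> dict:
    # Hypothetical: gather raw frames from cameras, lidar and radar.
    return {"camera": [], "lidar": [], "radar": []}

def fuse(raw: dict) -> WorldModel:
    # Hypothetical: combine sensor modalities into one world model --
    # e.g. using radar depth to reject a cardboard cutout that a
    # camera alone might classify as a pedestrian.
    return WorldModel()

def plan_and_act(world: WorldModel) -> None:
    # Hypothetical: decide on steering and braking, then send commands
    # to the vehicle actuators. Must complete in real time.
    pass

def control_loop(iterations: int) -> None:
    """The closed sense-fuse-plan-act loop that must live inside the vehicle."""
    for _ in range(iterations):
        raw = read_sensors()
        world = fuse(raw)
        plan_and_act(world)
```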
Weast said part of a car’s autonomy will come from its ability to be situationally aware all by itself.
“You’re trying to enumerate all the different possible objects that exist in the real world, particularly in complex, urban environments, where there could be dozens of unique objects to recognize and understand. And the car needs to be able to accurately identify all of them,” he said.
With large captured data sets, data scientists can analyze all that recorded sensor data and create machine learning models that enable autonomous vehicles to “learn” and then make probabilistic determinations about what they are seeing, based on their training.
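The kind of probabilistic determination described above can be illustrated with a toy softmax over object classes. The class names and raw scores below are invented for illustration, not output from any real model:

```python
import math

CLASSES = ["pedestrian", "cyclist", "car", "cardboard_cutout"]

def softmax(scores):
    """Turn raw model scores into a probability distribution."""
    exps = [math.exp(s) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

# Invented raw scores a trained model might emit for one detection.
raw_scores = [2.5, 0.3, 0.1, 1.9]
probs = softmax(raw_scores)
best = max(zip(CLASSES, probs), key=lambda cp: cp[1])
print(f"Most likely: {best[0]} (p={best[1]:.2f})")
```

The model never answers with certainty; it commits to the most probable class, which is why training data quality and coverage matter so much.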
A New Way of Thinking about Transportation
While the technology to make all this happen is quickly developing, the question remains whether a computer will ever be able to replace human intuition.
“The idea of autonomous vehicles is as much a human and social challenge as it is a technological challenge,” said Matt Yurdana, a UX creative director working on autonomous vehicles at Intel. “If people don’t feel that they’re safe, if they don’t have an emotional and psychological understanding that they can trust it, it will not matter what we do technologically.”
Take the medical profession as an example of uneven technology adoption, said Yurdana. While some high-tech labs share information over sophisticated digital servers, other offices are still printing medical records and filing them into paper folders.
Autonomous vehicles may also change people’s ideas and priorities about transportation, and eventually we might opt to be transported in a totally different way. Running errands and struggling to find parking, or sitting in traffic seething with road rage, might fall away when other options become available.
“Autonomous vehicles are going to change how people live, work, how we build and design cities. It will change and disrupt industries in all sorts of ways that we can’t even imagine at this point,” said Weast. “In the same way that petrol-powered cars changed the world, I believe autonomous vehicles are going to change our lives and change the lives of people around the world. It’s a really exciting evolution.”