Adding human-like senses and artificial intelligence to VR headsets is unlocking a new reality that’s stretching the imagination.
Imagine sitting in a car and strapping on a headset that lets the driver use her steering wheel to weave through a lifelike simulation of the autobahn. Then the driver stops, steps out of the car and walks around the scattered ruins of a medieval castle, digitally reconstructed in all its glory and viewable in 360 degrees.
This and other kinds of immersive merged reality experiences are poised to become more prevalent in 2017, as head-mounted displays (HMDs) rapidly evolve with powerful processors, human-like senses and artificial intelligence.
“The way we entertain ourselves and consume information is going through a revolution,” said Achin Bhowmik, vice president and general manager for the perceptual computing group at Intel.
The rise of immersive virtual reality (VR), augmented reality (AR) and mixed reality (MR) experiences is powered by human creativity and by technology sophisticated and powerful enough to simulate the fundamental laws of nature and replicate them in digital worlds (read VR’s Breakthrough Moment). Bhowmik’s team wanted to take these experiences further by combining the Intel technologies that power VR, AR and MR (including processors, graphics, memory and software) and adding artificial intelligence and human-like sensing from Intel RealSense camera technology. The result was Intel’s Project Alloy, a headset reference design helping product makers create new devices and experiences that can digitize the real world and bring it into virtual reality.
Bhowmik said the ability for devices to sense and understand the environment around them is opening new possibilities for blending real and digital worlds.
“When VR headsets can sense the wearer’s surroundings and bring the real world into virtual experiences, it allows you to interact with the virtual elements using your real hands,” Bhowmik said.
As an example, he explained how high fidelity merged reality might help medical students learn about human DNA and organs in new ways.
“I could put a Project Alloy headset on and see the DNA molecule in 3D, go around it, manipulate and interact with it,” said Bhowmik. “Think about interacting naturally with 3D structures of internal organs of the human body, the brain, the heart in order to understand how they work.”
Over the past five years, his team has developed technologies that bring human-like sensing capabilities to devices. In recent years, Intel RealSense depth-sensing camera technology brought 3D vision to a broad range of computing devices, giving them the ability to understand hand gestures and facial expressions and to digitize objects in the real world. It’s being built into drones and robots, giving them eyes to help them avoid collisions and autonomously navigate the 3D world. Now RealSense is being integrated into VR headsets, giving them spatial and contextual awareness.
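The core of that 3D vision is turning each pixel of a depth image into a point in space. As an illustrative sketch (not Intel’s implementation), the standard pinhole-camera deprojection looks like this in Python, with made-up intrinsics (`fx`, `fy`, `cx`, `cy`) standing in for a real camera’s calibration:

```python
import numpy as np

def depth_to_point_cloud(depth_m, fx, fy, cx, cy):
    """Deproject a depth image (in meters) into 3D points using a
    pinhole camera model, roughly as a depth camera digitizes a scene.
    fx, fy, cx, cy are hypothetical camera intrinsics."""
    h, w = depth_m.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth_m
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    # Stack into an (h*w, 3) array of XYZ points; drop invalid zero-depth pixels.
    points = np.stack([x, y, z], axis=-1).reshape(-1, 3)
    return points[points[:, 2] > 0]

# Toy 2x2 depth image: every pixel reads 1 meter away.
depth = np.ones((2, 2))
cloud = depth_to_point_cloud(depth, fx=1.0, fy=1.0, cx=0.5, cy=0.5)
```

Running this per frame yields the point clouds that let a headset reconstruct and reason about its surroundings.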
Bhowmik’s team designed Project Alloy to be an all-in-one VR headset with a full computer system. It allows its wearer to freely walk around without being tied to a computer or having to rely on external sensor systems for accurate positioning and head tracking. It was first shown at the Intel Developer Forum in September 2016 and newer versions will be demonstrated at the 2017 International Consumer Electronics Show.
“Project Alloy is an intelligent device that can sense, understand, interact and learn,” said Bhowmik.
“It captures the world in 3D in real time, runs 3D computer vision algorithms to understand what it sees,” he said. “Machine learning capabilities allow it to understand human intent such as hand gestures. With the headset on, you can simply lift and see your hands and use them naturally to interact with virtual objects, without relying on an external controller.”
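Gesture understanding of the kind Bhowmik describes can be sketched, at its simplest, as matching a tracked hand’s features against known gesture templates. The following is a minimal illustrative example, not Alloy’s actual algorithm; the feature vectors (hypothetical normalized fingertip extensions) and gesture names are invented for the sketch:

```python
import numpy as np

# Hypothetical training templates: one feature per finger, representing
# how extended it is (1.0 = fully open, ~0.2 = curled), as a depth-based
# hand tracker might report.
TEMPLATES = {
    "open_hand": np.array([1.0, 1.0, 1.0, 1.0, 1.0]),
    "fist":      np.array([0.2, 0.2, 0.2, 0.2, 0.2]),
    "point":     np.array([0.2, 1.0, 0.2, 0.2, 0.2]),
}

def classify_gesture(features):
    """Nearest-neighbor match: return the template closest to the
    observed feature vector in Euclidean distance."""
    return min(TEMPLATES, key=lambda g: np.linalg.norm(TEMPLATES[g] - features))

# A noisy observation of an extended index finger with the rest curled.
gesture = classify_gesture(np.array([0.25, 0.95, 0.15, 0.2, 0.3]))
```

A production system would replace the fixed templates with a learned model, but the shape of the problem — observed features in, recognized intent out — is the same.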
Unlike VR experiences today, merged reality lets people see and use their hands to bring real objects into a virtual environment. An Alloy headset wearer instantly sees his dog in VR as the dog walks into his real-life bedroom. He can see his hand reach out and pat the dog.
Bhowmik explained that being able to see one’s real limbs in VR provides correct proprioception cues; without them, the mismatch between what the body feels and what the eyes see can make people feel queasy after spending time in VR.
“To create an emotional connection with merged reality experiences, Project Alloy headsets bring an enormous amount of 3D data from the real world to make it feel natural,” he said. “It requires a tremendous amount of fast data capture and processing.”
To transport the real world into VR in high fidelity, the new Project Alloy headsets capture and process more than 50 million 3D points per second. Bhowmik said this will help developers create new merged reality experiences that immerse people – from almost anywhere they are – into favorite travel spots, space exploration, learning and entertainment experiences.
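For a sense of scale, a depth stream’s point throughput is simply resolution times frame rate. The numbers below are illustrative, not Alloy’s actual specifications; they show one plausible combination that clears the 50-million mark:

```python
# Back-of-envelope check (hypothetical numbers, not Alloy's real specs):
# point throughput = pixels per depth frame x frames per second.
width, height = 1280, 720   # assumed depth-stream resolution
fps = 60                    # assumed frame rate
points_per_second = width * height * fps  # 55,296,000
```

Every one of those points must be captured, deprojected and fused into the scene within a frame’s time budget, which is why Bhowmik emphasizes fast data capture and processing.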
Project Alloy headsets are being made available to device makers around the world, making it easier for developers to create new merged reality products and applications.
Meanwhile, Intel RealSense technology is going into all kinds of computing devices and systems, making them more intelligent and autonomous. Bhowmik said it took nature billions of years of evolution to develop sophisticated human perception: a rich 3D visual system, binaural hearing, skin connected to a nervous system sensitive to touch, and the senses of smell and taste. This high-functioning sensory network feeds information to a powerful brain with incredible processing capabilities.
It has taken only about a decade for digital devices to begin sensing like humans, thanks to the rapid pace of perceptual computing innovation. The ability to learn from and adapt to what devices sense is right around the corner.