How researchers are combining depth camera and Internet of Things technologies with artificial intelligence to create a wearable that augments sensory awareness for people living with impaired vision.
Photoreceptors in Adams’ eyes are declining, weakening his peripheral vision and ability to see in dim light or at night. He sees the world in out-of-focus fragments that he pieces together into a blurry mosaic.
But a coworker’s makeshift body sensor system that transforms sight into feeling is giving Adams hope.
“For years I have been thinking about how I could use technology to improve my vision or to potentially replace what I’m missing,” said Adams, a program manager who has worked at Intel for over 20 years.
Dubbed the Intel RealSense Spatial Awareness Wearable, the latest prototype is powered by an Intel Joule compute module, an Intel RealSense depth camera, several element14 tinyTILE boards featuring Intel Curie modules, and data-processing and haptic technologies. The prototype ‘sees’ objects a couple of yards ahead of and around the wearer, and it identifies where each object is: high, low, left or right.
When the wearer gets closer to an object, the system triggers thumb-sized vibrating sensors: three across the chest, three across the torso and two near the ankles, one on each leg. The closer the object, the more intense the vibration.
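The behavior described above — nearer objects produce stronger vibration in the sensor zone closest to the object — can be sketched in a few lines. This is an illustrative model only: the sensing range, zone names, and intensity scale are assumptions, not the prototype’s actual firmware.

```python
# Illustrative sketch of the wearable's distance-to-vibration mapping.
# Range, zone layout and intensity scale are assumed for illustration.

MAX_RANGE_M = 2.0    # "a couple of yards" of forward sensing (assumed)
MAX_INTENSITY = 255  # hypothetical PWM duty cycle for a vibration motor

def vibration_intensity(distance_m: float) -> int:
    """Map object distance to motor intensity: closer means stronger."""
    if distance_m < 0 or distance_m >= MAX_RANGE_M:
        return 0  # out of sensing range: no vibration
    return round(MAX_INTENSITY * (1 - distance_m / MAX_RANGE_M))

def select_zone(elevation: str, bearing: str) -> str:
    """Pick which sensor to fire from the object's rough position."""
    rows = {"high": "chest", "mid": "torso", "low": "ankle"}
    return f"{rows[elevation]}-{bearing}"  # e.g. "chest-left"

# An object half a meter away, high and to the left of the wearer:
print(select_zone("high", "left"), vibration_intensity(0.5))  # → chest-left 191
```

Scaling intensity linearly with distance is one simple choice; a real device might use a stepped or logarithmic ramp so that near-range changes feel more distinct.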
“The potential was immediately obvious,” said Adams, who has been testing and helping improve the prototype since it was first publicly demonstrated at the 2015 International Consumer Electronics Show.
His field of vision is limited to less than 20 percent of a normal visual field, so he misses visual stimuli that are important for social interaction. He often misses handshakes and other subtle social cues, and people seem to appear out of nowhere.
“Because I don’t see them approach, I am surprised and am not ready for that informal social interaction that would be the norm.”
He said the wearable prototype increased his confidence about making sense of and appropriately responding to his immediate surroundings.
“It showed that I can rely on something other than my limited sight to understand what’s around me.”
Since the first prototype was built in 2015, Robert Cooksey, a designer and research scientist with Intel’s Maker and Innovator Group, and his teammates have improved the design to make it more portable. Once worn, the device gives the wearer ambient spatial awareness of the environment, augmenting peripheral vision, Cooksey said.
“Working with Daryl has been a gift,” said Cooksey. “He’s worn our prototypes more than anyone, and he has really helped us understand how to make them work better.”
Putting Prototypes to the Test
While the technology can’t reverse blindness, Adams and Cooksey believe the 39 million people who are blind and the 285 million with impaired vision worldwide, figures from the World Health Organization, could benefit from it if product makers realize its potential.
Cooksey hopes a new prototype design will inspire device makers to create and sell their own devices.
“If we could make it easier for developers and inventors to take this all the way to market, then that would make my dream come true,” he said.
In fall 2016, ophthalmology and visual science students at the University of Iowa, led by Stephen Russell, MD, built a similar prototype based on the open-source design Intel shared online.
“Dr. Russell knows technologies aimed at combating retinal degenerative diseases, and he saw Intel’s prototype as a simpler way to develop a device,” said Dylan Green, a senior undergraduate and research assistant working on the University of Iowa project.
Green said that while FDA-approved retinal implants available today help strengthen central vision, Intel’s prototype is one of the few devices that addresses peripheral vision loss.
Green’s team made several improvements to the Intel blueprint, creating a more streamlined device that is easier to put on and operate. The team now has 10 prototypes, called the LEO Belt, for Low-vision Enhancement Optoelectronic Belt, that are being tested with people on a makeshift obstacle course.
“The main goal for our project is to help bring more affordable alternative solutions than what’s out there today,” said Green. “We plan to go through all the medical hoops, show it’s a useful device and increase public awareness of it. We want to help speed the process of getting this technology out there.”
Green said the technology is evolving rapidly. His team already switched from using a Microsoft Surface tablet computer to an Intel Compute Stick, which is half the size of a deck of cards.
He said new compute modules and camera technology keep getting smaller and more powerful, which will lead to better devices that enhance people’s lives.
Intel’s original prototype, a patchwork of parts that didn’t connect easily, is about to go through a significant hardware upgrade, said Cooksey.
“I remember the first time I wore the unit,” he said. “I closed my eyes, held the laptop in my hands and walked around the lab with a camera duct-taped to my chest. I could get a feel for the shape of the space and was able to navigate around the lab.”
That was in 2014, when Intel didn’t have the kind of portable computer modules that exist today. Cooksey is using new hardware to build a better prototype.
“Right now I’m building a battery-powered Intel Joule module connected to the most recent Intel RealSense camera, the ZR300,” he said. The new camera is a significant improvement over the one Cooksey used on the first prototype, and he said its advances could enable new capabilities in future prototypes.
The Intel Joule compute module includes Bluetooth and Wi-Fi radios, memory and a variety of USB connectors, which allowed Cooksey to build a smaller prototype with new capabilities.
“The Intel Joule module provides processing power to simultaneously run the depth camera and other functions such as object recognition or person detection,” he said.
Cooksey is also exploring ways to build in artificial intelligence that might help the device adapt to its owner and automate some functions by leveraging services powered by cloud computing.
Cambrian Explosion of Innovation
Cooksey credits colleague Rajiv Mongia, director of experience and outreach for Intel’s Maker and Innovator Group, for sparking the idea that led the Intel team to create the Spatial Awareness Wearable prototype. Mongia sees this innovation as an example of what’s driving the fourth industrial revolution.
“In the first three industrial revolutions, people were often challenged to get their hands on the technologies necessary to innovate. As a result, innovation was limited to those who had access to the financial or knowledge resources to pursue their dreams,” said Mongia.
“What’s driving the fourth industrial revolution is accessible technologies that make it easier for tens of millions of people to innovate.”
When powerful technologies are accessible to more people, more people can experiment, prototype and bring products to market. Mongia said this will create a wave of innovation similar to the Cambrian Explosion in biology, which led to the variety of species we have today.
He said that having a whole compute system on a module like Intel Joule allows people to turn their ideas into test products without going through challenges such as detailed board design, validation, software development and certifications.
“You can scale production from one hundred to ten thousand devices quickly without requiring a big investment,” said Mongia, adding that people can use Kickstarter or other grassroots funding to develop their ideas.
“Innovation in the next 10 years is going to be mind boggling.”
Using sensory substitution, as the Spatial Awareness Wearable prototype does, to help people make better sense of the visual world has tremendous potential, said Adams.