
Assistive Technology for Visually Impaired Uses Intel 3D Cameras

Ken Kaplan, Executive Editor, Intel iQ

Wearable prototype uses Intel RealSense technology to bring better awareness to people with low or no vision.

What started off as a not-so-simple challenge last August in a tiny lab inside Intel headquarters quickly evolved into a promising portable prototype that has the potential to help blind and vision-impaired people gain a better sense of their surroundings.

The wearable, environmental sensing system, which uses Intel RealSense 3D camera technology, made its public debut at the 2015 International Consumer Electronics Show. Intel CEO Brian Krzanich said that 39 million blind people in the world and 250 million people with impaired vision could potentially benefit from this technology in the future.

Krzanich invited Darryl Adams, a technical project manager at Intel who was diagnosed with retinitis pigmentosa (RP) nearly 30 years ago, to the stage. The photoreceptors in Adams' eyes are degenerating, weakening his peripheral vision and his ability to see in dim light or at night.

During a few emotion-filled moments on stage, Adams described how the technology helped him. The Oregonian called it the most compelling technology demonstration of the keynote.


“If we can bring vision to PCs and tablets, why not use that same technology to help people see?” asked Rajiv Mongia, director of the RealSense Interaction Design Group, referring to Intel RealSense technology, which is becoming available in new computing devices hitting the market.

Mongia, along with Chandrika Jayant, Robert Cooksey and Sarang Borude (pictured at top), is part of a multidisciplinary team of design, human-computer interface, human factors and prototyping experts that has been focused on finding natural, intuitive and immersive ways to use the RealSense 3D camera technology.

“Computing and sensing together can give people awareness of their environments while they’re moving around inside or outdoors,” Mongia said.

In the early days of development, the team experienced a Star Trek moment when Mongia remembered “Is There In Truth No Beauty?” (Season 3, Episode 5 of the original series), which features a blind woman wearing a smart dress that gives her computer vision.

Getting the RealSense technology to work on the human body meant Mongia’s team had to create customized clothing fitted with the camera and a computing module that connects wirelessly to eight thumb-sized vibrating sensors: three across the chest, three across the torso and one near each ankle.
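To make that layout concrete, here is a minimal sketch of how a detected object might be routed to one of those eight actuators. The zone names, IDs and function are illustrative assumptions, not code from Intel's prototype:

```python
# Hypothetical sketch of the eight-actuator layout described above.
# Zone names and IDs are illustrative, not from Intel's prototype.
HAPTIC_ZONES = {
    "chest_left": 0, "chest_center": 1, "chest_right": 2,
    "torso_left": 3, "torso_center": 4, "torso_right": 5,
    "ankle_left": 6, "ankle_right": 7,
}

def actuator_for(direction: str, height: str) -> int:
    """Pick the actuator matching where the camera saw an object.

    direction: 'left', 'center' or 'right'; height: 'high', 'mid' or 'low'.
    """
    if height == "low":
        # Only two ankle actuators, so 'center' maps to an arbitrary side.
        side = "left" if direction == "left" else "right"
        return HAPTIC_ZONES[f"ankle_{side}"]
    row = "chest" if height == "high" else "torso"
    return HAPTIC_ZONES[f"{row}_{direction}"]

print(actuator_for("right", "high"))  # chest_right -> 2
```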


“In the first prototype, we positioned the camera so it sees a person’s normal front-face field of vision,” said Mongia.

The system can ‘see’ objects within a couple of yards of the user and tell the user approximately where each object is located: high or low, left or right, and whether it is getting closer or moving away.

When the wearer is walking and approaches an object, such as a wall or another person, the sensor boxes vibrate. Vibrations intensify as the wearer gets closer to the object.
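One simple way to picture that relationship is a linear ramp from the edge of the sensing range to contact, sketched below. The range constant and 8-bit drive level are assumptions for illustration, not values from Intel's prototype:

```python
# Hypothetical sketch: vibration grows stronger as an object gets closer,
# fading to zero at the edge of the assumed ~2-yard (~1.8 m) range.
MAX_RANGE_M = 1.8   # assumed detection range, per the article
MAX_LEVEL = 255     # assumed 8-bit motor drive level

def vibration_level(distance_m: float) -> int:
    """Return 0 (off) to 255 (strongest) for an object at distance_m."""
    if distance_m >= MAX_RANGE_M:
        return 0
    closeness = 1.0 - distance_m / MAX_RANGE_M  # 0.0 at the edge, 1.0 at contact
    return round(closeness * MAX_LEVEL)

print(vibration_level(1.5))  # weak buzz as something enters range
print(vibration_level(0.3))  # strong buzz up close
```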

But there is much more work to be done, such as deciding how to graduate the intensity of the vibrations and how to tune the system for people with different needs or preferences. These are only a few of the things the team is exploring as it continues to test the gear with people walking around various Intel campuses.

Mongia said that people who’ve tried it say the prototype has promise, that it augments their senses and helps them feel the environment around them.

“When you see it help someone, you get the sense we have something valuable here,” he said.

One of those people is Adams.


“I’m enthusiastic about what it can do,” Adams said after his third day of testing it.

“My visual field is limited to less than 20 percent of a normal visual field,” he said. “A primary issue for me is that I miss the majority of visual stimuli that occurs in my near vicinity…the area where most social interaction occurs.”

He often misses handshakes and other subtle social cues, and people sometimes seem to appear out of nowhere.

“Because I don’t see them approach, I am surprised and am not ready for that informal social interaction that would be the norm.”

Building the Prototype

In recent years, others have experimented with a variety of computer vision technologies to help people who can’t see. Most of those systems have been bulky, expensive, low quality or unreliable.

Intel shrank the sensor components and reduced the power requirements so that RealSense cameras can be integrated into many types of devices.

“This system is high resolution,” said Adams. “It’s like sonar telling you something is in your way, but this one sees more details.”

The first prototype consists of a backpack holding the Intel Core processor-powered module that computes what the camera sees and triggers the haptic vibrations on the wearer’s body.


Mongia said that the system could also be programmed for facial, body skeleton and object recognition.

“But overall it must remain simple and responsive,” he said. “It must avoid cognitive overload on the wearer.”

And it needed to work inside and outdoors.

What It Can Do

The critical space around a person extends to about arm’s length, said Adams, because that’s our personal space. Things beyond that periphery matter less.

“This system could allow someone like me to focus on social interactions versus environmental aspects,” he said. “It would lessen the tradeoffs I have to make whenever I’m multitasking.”

The prototype removed the need for him to constantly look for changes in his immediate surroundings, because vibrations alert him when something has changed.

“It augmented my visual limitation with a well-established sense of touch,” said Adams.

The system identified objects in his blind spots and signaled what it sensed with specific vibrations. Adams said this allowed him to remain engaged and in the moment.

“It gave me a sense of increased confidence concerning my ability to make sense of and appropriately respond to the immediate world around me.”

Adams imagines how people could use the information in addition to a cane or guide dog to gain a deeper, higher-resolution understanding of their environment.

Adams and Mongia talk about how the system could be programmed for object recognition and even people or facial recognition. They discuss other output types beyond vibration.

“Audio could be used to identify an approaching person by name,” said Adams.

Mongia shared a few other ideas for the technology.

“Imagine its use in autonomous machines or in sports, providing a rearview mirror for cyclists or runners, but doing so in a way that doesn’t cause interference.”

In Krzanich’s keynote address, he finished his segment with Adams by talking about how RealSense and wearable technology can change people’s lives and that Intel is committed to bringing these types of solutions to people everywhere.

“We’ll be making this wearable technology openly available to the broader ecosystem later this year,” Krzanich said. “We’ll make the source code and the design tools publicly available so developers can extend and improve this platform. Anyone can use this to help the visually impaired and build upon it.”


Editor’s Note: For more on this and other stories from the 2015 International Consumer Electronics Show, watch the replay of Intel CEO Brian Krzanich’s keynote address.

