NIH grant funds development of a wayfinding app for the blind

Engineer Roberto Manduchi develops assistive technologies for people with disabilities

A blind participant tests a path backtracking app developed at UC Santa Cruz. (Photos courtesy of Roberto Manduchi)
Roberto Manduchi

“Have you ever been to the Baskin Engineering building basement? I took some friends of mine who are blind down there and tried to get them lost,” says Roberto Manduchi, professor of computer science and engineering in the Baskin School of Engineering at UC Santa Cruz.

Finding their way out of the subterranean maze of humming machines and convoluted corridors beneath the engineering building would ordinarily be a nightmare for many people with visual impairments, but Manduchi’s 12 participants weren’t lost for long. Each was equipped with an app-based system that used an iPhone’s inertial sensors to guide them back.

“Tracking your way back is very important when you’re blind,” Manduchi said. “Imagine you were blind and wanted to see my office, and someone met you at the entrance and took you there, but wasn’t around to take you back. This system would have tracked you from point A to point B and could help guide you back again.”  
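The core mechanic is simple to state: log each walked segment as a direction and a distance, then play the log back in reverse. Here is a minimal sketch in Python, assuming the phone has already reduced its sensor stream to per-segment headings and distances (an illustration of the idea, not the lab’s actual code):

```python
def record_segment(path, heading_deg, distance_m):
    """Append one walked segment (compass heading in degrees, distance in meters)."""
    path.append((heading_deg, distance_m))

def backtrack_route(path):
    """Reverse the recorded path into return directions: walk the segments
    in reverse order, each with its heading flipped by 180 degrees."""
    return [((heading + 180.0) % 360.0, distance)
            for heading, distance in reversed(path)]

# Walk 10 m east, then 4 m north; the return route is 4 m south, then 10 m west.
path = []
record_segment(path, 90.0, 10.0)   # east
record_segment(path, 0.0, 4.0)     # north
print(backtrack_route(path))       # [(180.0, 4.0), (270.0, 10.0)]
```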

The backtracking system was developed by a graduate student in Manduchi's Computer Vision Lab, Germán Flores, who earned his Ph.D. in 2017 and is currently at IBM's Almaden Research Center. A new $1.1 million grant from the National Institutes of Health is now funding Manduchi's ongoing work to develop the system into an iPhone app that will support both path backtracking and map-based wayfinding.

Manduchi explained that navigation apps like Waze or Google Maps, which rely on the Global Positioning System (GPS), don’t work well indoors. Indoor environments also tend to be more cluttered and crowded than outdoor ones, making navigation more complicated (imagine the layout of a busy airport terminal or shopping mall).

“Smartphones nowadays have a lot of sensors—accelerometers, gyros, magnetometers—to do things like step counting, so you can tell what direction you’re walking in, or when you’re making a turn. Put all this information together with a little artificial intelligence and you have the phone effectively tracking your location, especially if you have a map,” Manduchi said.
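This technique is known as pedestrian dead reckoning: each detected step advances a position estimate along the current compass heading. A toy version in Python, assuming a fixed stride length (real systems refine this with gyroscope data, map matching, and learned stride models):

```python
import math

def dead_reckon(step_headings, start=(0.0, 0.0), stride_m=0.7):
    """Estimate a walking track from per-step compass headings (degrees).
    Returns the (east, north) position in meters after each detected step."""
    x, y = start
    track = []
    for heading_deg in step_headings:
        theta = math.radians(heading_deg)
        x += stride_m * math.sin(theta)  # east component of the step
        y += stride_m * math.cos(theta)  # north component of the step
        track.append((x, y))
    return track

# Ten steps east, then five steps north:
print(dead_reckon([90.0] * 10 + [0.0] * 5)[-1])  # roughly (7.0, 3.5) meters
```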

Relying on an iPhone’s inertial sensors for wayfinding also makes the system more convenient than other smartphone-based assistive technologies that depend on a video feed from the phone’s camera. A camera-based app requires the user to hold the phone up; with Manduchi’s system, the phone can stay in the user’s pocket.

Manduchi and Flores presented their initial findings on the backtracking system in April at the 2018 ACM CHI Conference on Human Factors in Computing Systems in Montreal. Flores was also instrumental in another major assistive technology project that Manduchi is working on, together with engineering professors Ethan Miller and Sri Kurniawan, to make public transit services more accessible.

“We want to make buses, trains, and other vehicles much more accessible to the blind,” Manduchi said.

Called RouteMe2, the system involves a smartphone app, as well as a cloud-based computer network, physical infrastructure, and beacons deployed in transit vehicles to help identify them as they approach the user. With funding from the National Science Foundation, Manduchi is partnering with the Santa Clara Valley Transportation Authority and IBM Research on the project.
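The beacon idea can be sketched in general terms: each vehicle broadcasts a unique beacon ID, the rider’s phone reports the IDs it hears along with their signal strengths (RSSI), and a registry maps the strongest ID to a route. Everything below (the registry, the names, the routes) is hypothetical and for illustration only; it is not RouteMe2’s actual design:

```python
# Hypothetical beacon-ID -> route registry; in a deployed system this
# lookup would live in the cloud service, not on the phone.
VEHICLE_REGISTRY = {
    "beacon-17a3": "Route 22 toward Palo Alto",
    "beacon-90bf": "Route 522 toward Eastridge",
}

def identify_approaching_vehicle(sightings):
    """sightings: dict of beacon_id -> RSSI in dBm (stronger ~= closer)."""
    known = {b: rssi for b, rssi in sightings.items() if b in VEHICLE_REGISTRY}
    if not known:
        return None
    nearest = max(known, key=known.get)  # highest RSSI = likely nearest vehicle
    return VEHICLE_REGISTRY[nearest]

print(identify_approaching_vehicle({"beacon-17a3": -62, "beacon-90bf": -81}))
# -> "Route 22 toward Palo Alto"
```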

Closer to home, much of Manduchi’s other work focuses on information access for people with low vision or who have mobility issues.

“I have a student with a serious disability—cerebral palsy. He has poor motion control and his speech is difficult to understand, but he’s super smart and able to type using a camera focused on his head and moving the cursor that way,” Manduchi said. “We want to create a system that would use AI computer vision to facilitate this communication, because the faster he can type, the better he can communicate.”

Manduchi is beginning user studies with Dr. Susana Chung at UC Berkeley’s School of Optometry that could help this student and others with similar mobility issues. They’re testing a system that uses the reflected glint of an infrared beam in a user’s eyes to track where someone is looking on a screen. By tracking a user’s gaze and moving a cursor along in response, the system could allow someone to rapidly scan through highly magnified type.
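Gaze trackers of this kind are typically calibrated by having the user look at known on-screen targets while the tracker records the pupil-to-glint offset, then fitting a mapping from offsets to screen coordinates. A compact sketch of that calibration step using a least-squares affine fit (an illustration of the general technique, not the Berkeley system’s code):

```python
import numpy as np

def fit_affine(offsets, screen_pts):
    """offsets: Nx2 pupil-glint vectors; screen_pts: Nx2 known gaze targets."""
    A = np.hstack([offsets, np.ones((len(offsets), 1))])  # add a bias column
    M, *_ = np.linalg.lstsq(A, screen_pts, rcond=None)    # 3x2 affine map
    return M

def gaze_to_cursor(M, offset):
    """Map one pupil-glint offset to (x, y) screen coordinates."""
    return np.array([offset[0], offset[1], 1.0]) @ M

# Calibrate on four corner targets (synthetic numbers for illustration):
offsets = np.array([[-1.0, -1.0], [1.0, -1.0], [-1.0, 1.0], [1.0, 1.0]])
targets = np.array([[0, 0], [1920, 0], [0, 1080], [1920, 1080]])
M = fit_affine(offsets, targets)
print(gaze_to_cursor(M, [0.0, 0.0]))  # center of the screen: ~[960, 540]
```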

As with most of his projects, Manduchi considers cost a crucial component of accessibility. His goal is to eventually replace the $3,000 infrared tracking system with the computer’s built-in camera. The researchers received initial seed funding for the project from the Research to Prevent Blindness/Reader’s Digest Partners for Sight Foundation but are still seeking additional support.

“The beauty of a smartphone or a computer is that it’s the same tool sighted people use,” Manduchi said. “It’s not an ugly assistive device that looks weird. With the same iPhone everyone else uses, a blind individual can go online or on Facebook.”

Manduchi, who joined the faculty at UC Santa Cruz in 2001, said the campus has been tremendously supportive of his focus on disabilities and accessibility. He teaches “Universal Access: Disability, Technology and Society,” a general education course that, in keeping with his approach, reaches beyond the purely technological aspects of accessibility.

“I would be very interested in seeing how these new technologies that are integrated into apps could be used for accessibility in developing countries where traditionally people with disabilities are left behind,” he said. “That’s my next big move, not pure technology, but more of a global health issue.”