Researchers at the University of California, Santa Cruz (UCSC) Computer Vision Laboratory will use a $1.1-million grant from the U.S. National Institutes of Health to develop an indoor backtracking system for blind users as an iPhone app.
The system was developed by former graduate student German Flores, now a Ph.D. holder working at IBM's Almaden Research Center. As an app, it will use an iPhone's inertial sensors to guide users back along a path they have taken, and it will also support map-based wayfinding.
Says Roberto Manduchi, a founder of UCSC's Computer Vision Lab who is working with Flores to develop the iPhone app, "Smartphones nowadays have a lot of sensors—accelerometers, gyros, magnetometers—to do things like step counting, so you can tell what direction you're walking in, or when you're making a turn. Put all this information together with a little artificial intelligence and you have the phone effectively tracking your location, especially if you have a map."
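To illustrate the kind of inertial tracking Manduchi describes, the following is a minimal Swift sketch of dead reckoning from step and heading data, followed by reversal of the recorded path. The StepEvent and PathTracker types are hypothetical and are not taken from the UCSC app; in a real iPhone app the step and heading inputs would come from Core Motion, but here they are supplied as plain values.

```swift
import Foundation

// Hypothetical sketch: each detected step, together with the heading at that
// moment, advances a dead-reckoned position. Reversing the stored steps yields
// guidance back along the same path. In a real iPhone app, steps and headings
// would come from Core Motion (CMPedometer / CMDeviceMotion).
struct StepEvent {
    let headingRadians: Double   // direction of travel, 0 = north, clockwise
    let strideMeters: Double     // estimated stride length for this step
}

struct PathTracker {
    private(set) var position = (x: 0.0, y: 0.0)   // meters east/north of start
    private var steps: [StepEvent] = []

    // Dead-reckon one step forward from the current position.
    mutating func record(_ step: StepEvent) {
        position.x += step.strideMeters * sin(step.headingRadians)
        position.y += step.strideMeters * cos(step.headingRadians)
        steps.append(step)
    }

    // Return the reversed route: same steps, opposite headings, last step first.
    func backtrackRoute() -> [StepEvent] {
        steps.reversed().map {
            StepEvent(headingRadians: $0.headingRadians + .pi,
                      strideMeters: $0.strideMeters)
        }
    }
}

// Usage: walk three steps east, then two steps north, then ask for the way back.
var tracker = PathTracker()
for _ in 0..<3 { tracker.record(StepEvent(headingRadians: .pi / 2, strideMeters: 0.7)) }
for _ in 0..<2 { tracker.record(StepEvent(headingRadians: 0, strideMeters: 0.7)) }
print(tracker.position)               // roughly (2.1, 1.4)
print(tracker.backtrackRoute().count) // five reversed steps, last one first
```

This is only a sketch of the general technique; combining such sensor-derived steps with a building map, as the quote suggests, is what lets the app correct the drift that raw dead reckoning accumulates.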
Manduchi and Flores presented their initial results in April at the 2018 ACM Conference on Human Factors in Computing Systems (CHI 2018) in Montreal, Canada.
From UC Santa Cruz
Abstracts Copyright © 2018 Information Inc., Bethesda, Maryland, USA