Sensor Fusion Localization and Navigation for Visually Impaired People
- Authors: Galioto, G.; Tinnirello, I.; Croce, D.; Inderst, F.; Pascucci, F.; Giarre, L.
- Publication year: 2018
- Type: Proceedings
- Keywords: Control and Systems Engineering; Control and Optimization
- OA Link: http://hdl.handle.net/10447/344417
In this paper, we present an innovative cyber-physical system for indoor and outdoor localization and navigation, based on the joint use of dead-reckoning and computer-vision techniques in a smartphone-centric tracking system. The system is explicitly designed for visually impaired people, but it can easily be generalized to other users, and it is built under the assumption that special reference signals, such as colored tapes, painted lines, or tactile paving, are deployed in the environment to guide visually impaired users along pre-defined paths. Unlike previous works on localization, which focus only on the inertial sensors integrated into smartphones, we exploit the smartphone camera as an additional sensor that, on one side, helps the visually impaired user identify the paths and, on the other side, provides direction estimates to the tracking system. We demonstrate the effectiveness of our approach through experimental tests performed in a real outdoor installation and in a controlled indoor environment.
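The abstract does not specify how the camera direction estimates are combined with the inertial dead-reckoning track. As a minimal illustrative sketch only (not the authors' actual algorithm), such a fusion can be expressed as a complementary filter: the drifting gyroscope-integrated heading is corrected toward the drift-free heading inferred from the detected guide line. The function names, the weight `alpha`, and the step-length model below are all assumptions introduced for illustration:

```python
import math

def fuse_heading(gyro_heading, camera_heading, alpha=0.9):
    """Complementary-filter blend of two heading estimates (radians).

    gyro_heading: heading obtained by integrating gyroscope yaw rate
    (accurate short-term, but drifts over time).
    camera_heading: heading inferred from the orientation of the detected
    reference line (noisier, but drift-free).
    alpha: weight on the gyro estimate (illustrative choice).
    The blend is applied to the signed angular difference so that
    wrap-around at +/-pi is handled correctly.
    """
    diff = math.atan2(math.sin(camera_heading - gyro_heading),
                      math.cos(camera_heading - gyro_heading))
    fused = gyro_heading + (1.0 - alpha) * diff
    # Normalize the result back into (-pi, pi].
    return math.atan2(math.sin(fused), math.cos(fused))

def dead_reckon(position, heading, step_length):
    """One pedestrian dead-reckoning update: advance the 2-D position
    by step_length along the current fused heading."""
    x, y = position
    return (x + step_length * math.cos(heading),
            y + step_length * math.sin(heading))
```

With this scheme, each detected step advances the position along the fused heading, so camera corrections bound the heading drift that pure inertial dead-reckoning would accumulate.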