Indoor Navigation Glasses for the Visually Impaired with Deep Learning and Audio Guidance Using Google Coral Edge TPU

Pocholo James M. Loresco a, Rence Jerome C. Cruz a, Kingsley Z. Ramones a, Julia Angellica D. Zafra a, Karl Russell G. Ramirez a

a Electronics Engineering, FEU Institute of Technology, Manila, Philippines
Abstract: Visual impairment continues to be a global health concern. People with visual impairment experience difficulty moving around indoors, especially in unfamiliar spaces. While existing assistive technologies such as smart canes offer point-to-point navigation or rely on infrastructure such as RFID tags or beacons, they lack the ability to provide comprehensive indoor navigation with obstacle detection and avoidance. This paper presents a novel indoor navigation system for visually impaired individuals using deep learning and audio guidance. The system consists of 3D-printed glasses equipped with a Raspberry Pi Camera Module v2, an audio user interface, and a processing unit comprising a Raspberry Pi 4B and a Google Coral Edge tensor processing unit (TPU). As validated in a controlled indoor environment, the deep learning models for localization, navigation, obstacle detection, and obstacle avoidance achieve high accuracy, precision, recall, and F1-score. Based on user tests using the System Usability Scale, this wearable assistive device appears to offer a promising solution for promoting independent navigation and spatial awareness among visually impaired individuals.