Real-time guidance for visually impaired users
Publish Time: 26 Nov, 2025

Researchers at Penn State have developed a smartphone application, NaviSense, that helps visually impaired users locate objects in real time using AI-powered audio and vibration cues.

The tool uses vision-language models and large language models to identify objects on the fly, so 3D models of target objects do not need to be preloaded.

NaviSense was designed with feedback from visually impaired users and offers conversational object search plus real-time hand guidance, improving flexibility and precision over existing visual-aid solutions.
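The article does not describe NaviSense's internals, but the hand-guidance idea can be illustrated with a minimal, purely hypothetical sketch: once a detector returns a bounding box for the requested object, the box's position in the camera frame is mapped to a coarse directional cue that could be spoken or rendered as vibration. The function name and thresholds below are illustrative assumptions, not the app's actual implementation.

```python
def direction_cue(bbox, frame_width):
    """Map a detected object's bounding box to a coarse guidance cue.

    bbox: (x_min, y_min, x_max, y_max) in pixels.
    frame_width: camera frame width in pixels.
    Splits the frame into thirds (an arbitrary illustrative choice)
    and reports which third the box centre falls in.
    """
    centre_x = (bbox[0] + bbox[2]) / 2
    if centre_x < frame_width / 3:
        return "left"        # object is in the left third of the view
    if centre_x > 2 * frame_width / 3:
        return "right"       # object is in the right third of the view
    return "ahead"           # object is roughly centred


# Example: a box centred at x=550 in a 640-px-wide frame.
print(direction_cue((500, 0, 600, 100), 640))  # → right
```

A real system would update this cue continuously as the hand and camera move, and would fold in depth or vertical position as well; this sketch only shows the basic frame-to-cue mapping.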

Tests showed it reduced search time and increased detection accuracy, with users praising the directional feedback.

The development team continues to optimise the application's battery use and AI efficiency in preparation for commercial release. Supported by the US National Science Foundation, NaviSense represents a significant step towards practical, user-centred accessibility technology.
