# OpenVNAVI: A Vibrotactile Navigation Aid for the Visually Impaired



Bachelor thesis at the Media Computing Group, Computer Science Department, RWTH Aachen University (Germany).

Author: David Antón Sánchez

Project website: https://hci.rwth-aachen.de/openvnavi


### System Description

OpenVNAVI is a vest equipped with a depth sensor and an array of vibration motor units that helps people with visual impairments avoid obstacles in their environment.

The ASUS Xtion PRO LIVE depth sensor, mounted on the user’s chest, scans the environment as the user moves. A frame is captured from the sensor’s video feed and processed by a Raspberry Pi 2. Each frame is downsampled from 640×480 to 16×8 pixels, and each pixel is then mapped to one vibration motor unit in an array positioned on the user’s belly.
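The downsampling step can be sketched as follows. This is an illustrative block-averaging implementation in NumPy, not the actual OpenVNAVI code; the function name and the use of mean pooling are assumptions.

```python
import numpy as np

def downsample_frame(depth_frame):
    """Reduce a 480x640 depth frame to the 8x16 motor-array resolution
    by averaging rectangular pixel blocks (480/8 = 60 rows, 640/16 = 40
    columns per block), so each output pixel summarizes one region of
    the scene and drives one vibration motor."""
    h, w = depth_frame.shape  # expected (480, 640)
    blocks = depth_frame.reshape(8, h // 8, 16, w // 16)
    return blocks.mean(axis=(1, 3)).astype(np.uint8)

# Example with a synthetic frame (real frames come from the Xtion sensor):
frame = np.random.randint(0, 256, (480, 640), dtype=np.uint8)
small = downsample_frame(frame)
print(small.shape)  # (8, 16): one value per vibration motor
```

Averaging (rather than naive subsampling) keeps each motor responsive to the whole region it represents, so a small obstacle is not lost between sample points.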

The grayscale value of each pixel in the low-resolution frame is converted to a PWM duty cycle, which the Raspberry Pi 2 outputs through PWM driver chips to each vibration motor. The vibration amplitude of a motor is therefore a function of the proximity of the object its pixel represents.
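A minimal sketch of the grayscale-to-PWM mapping, assuming an 8-bit depth image where brighter pixels encode closer objects and a 12-bit PWM driver (0–4095 duty range); the names and scaling are illustrative, not taken from the OpenVNAVI source.

```python
PWM_MAX = 4095  # full-scale duty cycle of a typical 12-bit PWM driver chip

def pixel_to_duty(gray):
    """Map an 8-bit grayscale value (0-255) to a PWM duty cycle (0-4095).
    Under the assumed encoding, brighter pixels mean closer obstacles,
    so they produce a stronger vibration."""
    return gray * PWM_MAX // 255

print(pixel_to_duty(0))    # motor off: nothing nearby
print(pixel_to_duty(255))  # full vibration: obstacle very close
```

In practice a lower threshold would likely be needed as well, since coin-type vibration motors do not spin at all below a minimum duty cycle, but that detail is hardware-specific.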

In this way the motor array renders a vibratory image on the user’s belly, helping the user build a mental representation of the obstacles in the scene.