OpenGuide-Assist is a wearable assistive device built around the Luxonis OAK-D camera. It combines RGB and depth data to detect obstacles and convey them to a person with visual impairments. The current version runs 3D obstacle and vehicle-part detection models, whose outputs drive audio-haptic guidance: audio is delivered over a Bluetooth connection, and haptic feedback is provided by a Raspberry Pi module described in our design below.
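As a rough illustration of how a detected obstacle's 3D position could be turned into audio-haptic cues, here is a minimal sketch. The mapping (bearing to stereo pan, distance to vibration intensity) and the function name are hypothetical, not the project's actual scheme; it assumes metric camera-frame coordinates such as those produced by OAK-D spatial detections.

```python
import math

def obstacle_to_feedback(x_m, z_m, max_range_m=4.0):
    """Map an obstacle's position (camera frame, metres) to feedback cues.

    Hypothetical mapping for illustration only:
    - pan in [-1, 1] follows the horizontal bearing of the obstacle
      (-1 = hard left, +1 = hard right)
    - haptic intensity in [0, 1] grows linearly as the obstacle nears
    """
    # Bearing relative to the optical axis, clamped to +/-90 degrees.
    bearing = math.atan2(x_m, max(z_m, 1e-6))
    pan = max(-1.0, min(1.0, bearing / (math.pi / 2)))

    # Closer obstacles vibrate harder; beyond max_range_m, no feedback.
    intensity = max(0.0, min(1.0, 1.0 - z_m / max_range_m))
    return pan, intensity
```

For example, an obstacle straight ahead at 2 m would yield a centered pan and a mid-strength vibration under this mapping.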
Documentation for setup and installation is available in the DepthAI API Docs.
This project was part of a Boston University Senior Design Project.