# Navicane

Navicane aims to help visually impaired people navigate outdoor environments more independently by providing:

- Real-time navigation guidance - the user receives haptic and audio feedback for turn-by-turn navigation to and from frequented destinations.
- Depth estimation - ultrasonic sensors on the cane detect sudden changes in ground elevation, e.g., stairs and potholes, and alert the user about the path ahead.
- Obstacle avoidance - a camera built into the cane detects approaching obstacles and warns the user with haptic feedback.
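As a rough illustration of the depth-estimation idea above, the sketch below classifies ultrasonic ground-distance readings against a flat-ground baseline. The baseline, thresholds, and function names are assumptions for illustration, not the actual Navicane code.

```python
import statistics

# Illustrative constants: a cane-tip-to-ground distance on flat ground,
# and how far a reading must deviate before we call it a hazard.
BASELINE_CM = 90.0      # assumed flat-ground distance
DROP_THRESHOLD_CM = 15  # reading this much longer suggests a pothole / stairs down
RISE_THRESHOLD_CM = 15  # reading this much shorter suggests a kerb / stairs up

def classify_ground(reading_cm, baseline_cm=BASELINE_CM):
    """Classify one ultrasonic reading relative to the flat-ground baseline."""
    delta = reading_cm - baseline_cm
    if delta > DROP_THRESHOLD_CM:
        return "drop"   # ground falls away ahead
    if delta < -RISE_THRESHOLD_CM:
        return "rise"   # ground rises ahead
    return "flat"

def smooth(readings, window=3):
    """Median over a trailing window to reject single-sample sensor noise."""
    return [statistics.median(readings[max(0, i - window + 1): i + 1])
            for i in range(len(readings))]

readings = [90, 91, 89, 120, 121, 122, 90, 70, 69]
labels = [classify_ground(r) for r in smooth(readings)]
print(labels)
# → ['flat', 'flat', 'flat', 'flat', 'drop', 'drop', 'drop', 'flat', 'rise']
```

In a real system the readings would arrive continuously from the sensor, and a "drop" or "rise" label would trigger the haptic alert described above.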

The cane went through multiple design iterations, incorporating feedback from the visually impaired community, to make the product more user-friendly.

We developed a working prototype (MVP) of the cane and gave a live demonstration. Here is a pitch video demonstrating our idea.

## Technology

The scripts are written in Python and run on a Raspberry Pi onboard the cane. The circuitry is powered by a standard portable power bank, which is easily recharged.
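To show the shape such an onboard script might take, here is a minimal sketch of mapping obstacle distance to haptic intensity (a PWM duty cycle for a vibration motor). The range limits and linear ramp are illustrative assumptions, not the project's actual tuning.

```python
# Hypothetical distance-to-vibration mapping for haptic feedback.
MAX_RANGE_CM = 200.0   # assumed: obstacles beyond this are ignored
MIN_RANGE_CM = 20.0    # assumed: closer than this means full-strength alert

def vibration_duty_cycle(distance_cm):
    """Return a PWM duty cycle (0-100); closer obstacle -> stronger vibration."""
    if distance_cm >= MAX_RANGE_CM:
        return 0.0
    if distance_cm <= MIN_RANGE_CM:
        return 100.0
    # Linear ramp between the two range limits.
    span = MAX_RANGE_CM - MIN_RANGE_CM
    return round(100.0 * (MAX_RANGE_CM - distance_cm) / span, 1)

for d in (250, 200, 110, 20, 5):
    print(d, vibration_duty_cycle(d))
```

On the actual hardware, the duty cycle would be fed to a PWM-driven motor pin (e.g., via a GPIO library) inside the sensor polling loop.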

## Future work

The cane could be integrated with a smartphone app to further extend its capabilities. Offloading work to the phone's greater computing power would allow more operations, including advanced computer vision algorithms that "see" the environment in greater detail.