Unfolding Space

The “Unfolding Space” glove is a prototypical open source navigation aid system for the visually impaired.

Prototype in Action

What is it about?

The project deals with sensory substitution, a phenomenon by which the function of one missing or dysfunctional sensory modality is replaced (substituted) by stimulating another one. The outcome is a prototype that projects a 3D image, captured by a special camera, as vibration patterns onto the back of the hand. Visually impaired people can thus use the tactile modality of the hand to actually “see”.

More information can be found here:

What parts are needed?

This document only deals with the software/code. To test or use the system, you would need to build the hardware yourself:

  • The Computing Unit: a custom 3D-printed casing containing a battery pack and a Raspberry Pi (~ 100€)
  • The Glove itself: containing a custom-made PCB and 9 vibration motors (~ 100€)
  • 3D ToF Camera: the Pico Flexx ToF camera by pmdtec (~ 300€)

As you can see, building and assembling all the parts requires some cost and effort. You can find build instructions, files, project logs and more on my Hackaday project page.

I have tried to document everything as well as I can, but as development is still ongoing, things change every week. If you plan to build your own device, drop me a line and I will assist as best I can.

Contents of this Readme

This readme focuses on the software including:

  • Main Code: C++ code to compile and run on a Raspberry Pi 3B+
  • Processing.org sketch for an Android app that monitors the Raspi's activities
  • Arduino sketch to run on a Digispark, used to read analog potentiometer values for the Raspi

Not included (due to license restrictions) is the libroyale library for the ToF camera Pico Flexx, which you can download if you own a device: https://pmdtec.com/picofamily/software-download/

Main Code for the Raspberry Pi 3b+

The code is written in C++, as it was the only language supported by the Pico Flexx library at the time I started the project. I'm an interaction designer with little experience in software development, and I'm quite new to GitHub and to code conventions in general. I hope you forgive my amateurish code, file structure and documentation. Feel free to comment and to help me if you see things that can be improved!

Installation

You need to install several libraries before you can build and run the main code. I will post a detailed guide; for now you have to look up and install the needed libraries yourself – a Makefile is already included.

Usage

Once compiled and running, the code analyses and processes the 3D data coming from the ToF camera and drives the vibration motors via I2C and the custom-made PCB, so that the user can feel the 3D image on the skin. The code prints debugging information to the console and broadcasts it via UDP on the network. If you run the Android app on a smartphone in the same network, you should be able to control and monitor the Raspi.
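The sending side of such a UDP debug channel can be sketched as follows. This is a minimal stand-in, not the project's actual code: the port number, the message format and the loopback target address are assumptions for illustration (the real code broadcasts on the local network so the Android app can pick the messages up).

```cpp
#include <arpa/inet.h>
#include <netinet/in.h>
#include <sys/socket.h>
#include <unistd.h>
#include <cstdint>
#include <string>

// Hypothetical sketch: send one debug/status line per frame as a UDP
// datagram. Port 9000 and the loopback default are made up for
// illustration; the project's code targets the local network instead.
bool sendDebugLine(const std::string& line, const char* addr = "127.0.0.1",
                   uint16_t port = 9000) {
    int sock = socket(AF_INET, SOCK_DGRAM, 0);
    if (sock < 0) return false;
    sockaddr_in dst{};
    dst.sin_family = AF_INET;
    dst.sin_port = htons(port);
    inet_pton(AF_INET, addr, &dst.sin_addr);
    ssize_t sent = sendto(sock, line.data(), line.size(), 0,
                          reinterpret_cast<sockaddr*>(&dst), sizeof(dst));
    close(sock);
    return sent == static_cast<ssize_t>(line.size());
}
```

Because UDP is connectionless, the sender does not need a listener to be present, which is convenient on a headless Raspi: the glove keeps running whether or not the monitoring app is connected.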

Files

The code consists of five files:

  • main.cpp contains the startup and initialization processes and runs the endless loop
  • init.cpp only holds some init functions called by main.cpp
  • camera.cpp receives the 3D frames from the ToF camera, processes them and feeds the data into the glove
  • glove.cpp contains functions for setting up the motors and sending the runtime data to the motor board
  • poti.cpp reads the position of a potentiometer connected to a Digispark Arduino
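To illustrate the kind of processing that happens between camera.cpp and glove.cpp, here is a hedged sketch that collapses a depth frame into nine motor intensities. Only the count of nine motors comes from the hardware description above; the 3×3 grid layout, the near/far clipping distances and the linear mapping are assumptions for illustration, not the project's actual algorithm.

```cpp
#include <algorithm>
#include <array>
#include <cstddef>
#include <cstdint>
#include <vector>

// Hypothetical sketch: reduce a depth frame (metres) to 9 motor
// intensities (0-255), one per cell of an assumed 3x3 grid on the back
// of the hand. Per cell, the nearest valid obstacle wins, and closer
// obstacles vibrate more strongly (linear ramp between the clip planes).
std::array<uint8_t, 9> depthToMotors(const std::vector<float>& depth,
                                     std::size_t width, std::size_t height,
                                     float nearClip = 0.3f,
                                     float farClip = 3.0f) {
    std::array<uint8_t, 9> motors{};
    for (std::size_t y = 0; y < height; ++y) {
        for (std::size_t x = 0; x < width; ++x) {
            float d = depth[y * width + x];
            if (d <= 0.0f) continue;  // skip invalid pixels
            std::size_t cell = (y * 3 / height) * 3 + (x * 3 / width);
            float t = std::clamp((farClip - d) / (farClip - nearClip),
                                 0.0f, 1.0f);
            motors[cell] = std::max(motors[cell],
                                    static_cast<uint8_t>(t * 255.0f));
        }
    }
    return motors;
}
```

The resulting nine values would then be written out over I2C to the motor board, which is the part glove.cpp is responsible for.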

Processing.org Sketch

To run the enclosed Processing sketch on your phone, go to Processing.org, download the Processing IDE and install it on your computer. Next, connect your phone to the computer via USB cable; you can then run the sketch directly on your phone. The newly created app will remain on your phone afterwards.

Arduino code

As a Raspberry Pi doesn't have an A/D converter on board, you need a small Arduino or microcontroller attached to it to read the analog potentiometer. I used a Digispark equipped with a potentiometer, as they are extremely small and affordable. You only have to upload the attached Arduino code and maybe adjust the serial port. The Digispark will then send the current position to the Raspi whenever it changes.
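On the Raspi side, the received values have to be turned into something usable. A minimal sketch, assuming the Digispark's 10-bit ADC range of 0–1023 and a simple one-number-per-line serial format (the line format is an assumption for illustration, not necessarily what poti.cpp actually parses):

```cpp
#include <string>

// Hypothetical sketch: convert one serial line from the Digispark
// (assumed format "<raw>\n", raw being a 10-bit ADC reading 0-1023)
// into a normalized 0.0-1.0 parameter, e.g. for vibration strength.
double potiToParam(const std::string& line) {
    int raw = std::stoi(line);  // std::stoi ignores the trailing newline
    if (raw < 0) raw = 0;       // clamp out-of-range readings
    if (raw > 1023) raw = 1023;
    return raw / 1023.0;
}
```

Clamping keeps a glitchy serial read from driving whatever parameter the potentiometer controls out of its valid range.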

Support and Contributing

I am urgently looking for contributors!

So far this project has been developed by me alone. I have not set up chat rooms, an issue tracker or anything else, as I'm not familiar with them and do not yet have any contributors. If you want to join the project, just contact me (here on GitHub, on Hackaday.io or via my homepage), and we will see how we can work this out.
