The “Unfolding Space” glove is a prototype of an open-source navigation aid for visually impaired people.
What is it about?
The project deals with sensory substitution: a phenomenon by which the function of a missing or dysfunctional sensory modality is replaced (substituted) by stimulating another one. The outcome is a prototype that projects a 3D image, generated by a special camera, as vibration patterns onto the back of the hand. Visually impaired people can thus use the tactile sense of the hand to actually see.
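As a rough illustration of the principle, here is a hedged sketch of how a depth frame could be condensed into nine vibration intensities, one per motor. The 3×3 grid, the distance range and all names are my assumptions for illustration, not the project's actual code:

```cpp
#include <algorithm>
#include <array>

// Illustrative sketch: downsample a depth frame into a 3x3 grid and map each
// cell to a vibration intensity (closer obstacle -> stronger vibration).
// Grid size, distance range and names are assumptions, not the real code.
constexpr int kRows = 3, kCols = 3;

std::array<int, kRows * kCols> depthToVibration(
    const float* depth, int width, int height,
    float minDist = 0.1f, float maxDist = 4.0f) {
  std::array<int, kRows * kCols> motors{};
  int cellW = width / kCols, cellH = height / kRows;
  for (int r = 0; r < kRows; ++r) {
    for (int c = 0; c < kCols; ++c) {
      // Use the nearest valid pixel in each cell as that cell's distance.
      float nearest = maxDist;
      for (int y = r * cellH; y < (r + 1) * cellH; ++y)
        for (int x = c * cellW; x < (c + 1) * cellW; ++x) {
          float d = depth[y * width + x];
          if (d > minDist && d < nearest) nearest = d;
        }
      // Linear mapping: minDist -> 255 (full power), maxDist -> 0 (off).
      float t = (maxDist - nearest) / (maxDist - minDist);
      motors[r * kCols + c] =
          static_cast<int>(255 * std::clamp(t, 0.0f, 1.0f));
    }
  }
  return motors;
}
```

The "nearest pixel per cell" rule is one plausible choice; averaging or weighting by image region would also be conceivable.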
More information can be found here:
- Hackaday.io Project Page with project logs, files and instructions
- Project Website with more information, images and press texts
- Publications on ResearchGate – my scientific papers (German only, sorry)
What parts are needed?
This document only covers the software/code. To test or use the system, you would need to build the hardware yourself:
- The Computing Unit: custom 3D printed casing containing battery pack and Raspberry Pi. (~ 100€)
- The Glove itself: containing a custom made PCB and 9 vibration motors. (~ 100€)
- 3D ToF Camera: Pico Flexx ToF Camera by pmdtec. (~ 300€)
As you can see, building and assembling all parts involves a certain cost and effort. You can find build instructions, files, project logs and more on my Hackaday project page.
I have tried to document everything as well as I can, but as development is still ongoing, things change every week. If you plan to build your own device, drop me a line and I will assist as best I can.
Contents of this Readme
This readme focuses on the software including:
- Main Code: C++ sources to compile and run on a Raspberry Pi 3B+
- Processing.org Sketch for an Android app monitoring the Raspi's activities
- Arduino Sketch to run on a Digispark to read out analog potentiometer values on a Raspi
Not included (due to license restrictions) is the libroyale library for the ToF camera Pico Flexx, which you can download if you own a device: https://pmdtec.com/picofamily/software-download/
Main Code for the Raspberry Pi 3b+
The code is written in C++, as it was the only language supported by the Pico Flexx library when I started the project. I'm an interaction designer with little experience in software development, and I'm bloody new to GitHub and to code conventions in general. I hope you forgive my amateurish code, file structure and documentation. Feel free to comment and to help me if you see things that can be improved!
You need to install several libraries before you can actually build and run the main code. I will post a detailed guide, but for now you have to look up and install the needed libraries yourself – a makefile is already included.
Once compiled and running, the code analyses and processes the 3D data coming from the ToF camera and drives the vibration motors via I2C and the custom-made PCB, so that the user can feel the 3D image on the skin. The code prints debugging information to the console and broadcasts it via UDP on the network. If you run the Android app on a smartphone in the same network, you should be able to control and monitor the Raspi.
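To give an idea of the broadcasting part, here is a minimal sketch of sending debug output as a UDP broadcast so that a monitoring app on the same network can pick it up. The port number, function name and plain-text message format are placeholders of mine; the project's actual protocol may differ:

```cpp
#include <arpa/inet.h>
#include <netinet/in.h>
#include <sys/socket.h>
#include <unistd.h>
#include <cstdint>
#include <string>

// Sketch: send one debug message as a UDP datagram. Defaults broadcast to
// the whole LAN; port and message format are illustrative assumptions.
bool sendDebug(const std::string& msg,
               const std::string& dest = "255.255.255.255",
               uint16_t port = 9000) {
  int sock = socket(AF_INET, SOCK_DGRAM, 0);
  if (sock < 0) return false;
  int yes = 1;  // required for sending to the broadcast address
  setsockopt(sock, SOL_SOCKET, SO_BROADCAST, &yes, sizeof(yes));
  sockaddr_in addr{};
  addr.sin_family = AF_INET;
  addr.sin_port = htons(port);
  inet_pton(AF_INET, dest.c_str(), &addr.sin_addr);
  ssize_t sent = sendto(sock, msg.data(), msg.size(), 0,
                        reinterpret_cast<const sockaddr*>(&addr), sizeof(addr));
  close(sock);
  return sent == static_cast<ssize_t>(msg.size());
}
```

UDP broadcast fits this use case well: the Raspi does not need to know the phone's address, and lost debug packets are harmless.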
The code consists of five files:
- main.cpp contains the startup and initialization processes and runs the endless loop
- init.cpp only holds some init functions called by main.cpp
- camera.cpp receives the 3D frames from the ToF camera, processes them and feeds the data into the glove
- glove.cpp contains functions for setting up the motors and sending the runtime data to the motor board
- poti.cpp reads the position of a potentiometer connected to a Digispark Arduino
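To illustrate how these files fit together, here is a hypothetical, runnable skeleton of the endless loop. All function names are illustrative stand-ins of mine, and stubs replace the camera and the I2C hardware:

```cpp
#include <array>
#include <cstdio>

// Hypothetical skeleton of the control loop described above. The real
// functions in main.cpp / camera.cpp / glove.cpp are likely named and
// structured differently; hardware access is stubbed out here.
using Pattern = std::array<int, 9>;  // one intensity per vibration motor

Pattern grabAndProcessFrame() {      // camera.cpp's role: depth -> pattern
  return Pattern{0, 64, 128, 192, 255, 192, 128, 64, 0};  // stub data
}

void driveMotors(const Pattern& p) { // glove.cpp's role: send via I2C
  for (int v : p) std::printf("%d ", v);
  std::printf("\n");
}

int runLoop(int iterations) {        // main.cpp's role: the (endless) loop
  for (int i = 0; i < iterations; ++i)
    driveMotors(grabAndProcessFrame());
  return iterations;
}
```

In the real device the loop would run until shutdown rather than for a fixed number of iterations.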
To run the enclosed Processing sketch on your phone, go to Processing.org, download the Processing IDE and install it on your computer. Then connect your phone to the computer via USB cable and run the sketch directly on the phone. The newly created app will remain on your phone afterwards.
As the Raspberry Pi doesn't have an A/D converter on board, you need a small Arduino or microcontroller attached to it. I used a Digispark equipped with a potentiometer, as they are extremely small and affordable. You only have to upload the attached Arduino code and maybe adjust the serial port. The Digispark will then send the current position to the Raspi whenever it changes.
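For illustration, here is a sketch of how the Raspi side might open the serial port and parse the incoming values. The device path `/dev/ttyACM0`, the baud rate and the "one decimal value per line" format are assumptions about the setup, not verified project details:

```cpp
#include <fcntl.h>
#include <termios.h>
#include <unistd.h>
#include <cstdlib>
#include <string>

// Sketch: open the Digispark's serial port in line-buffered (canonical)
// mode. Path and baud rate are assumptions and may need adjusting.
int openSerial(const char* path = "/dev/ttyACM0") {
  int fd = open(path, O_RDONLY | O_NOCTTY);
  if (fd < 0) return -1;
  termios tio{};
  tcgetattr(fd, &tio);
  cfsetispeed(&tio, B9600);  // assumed baud rate
  tio.c_lflag |= ICANON;     // deliver data line by line
  tcsetattr(fd, TCSANOW, &tio);
  return fd;
}

// Parse one newline-terminated reading; returns -1 on malformed input.
// The 0..1023 range assumes the Digispark's 10-bit ADC.
int parsePotiLine(const std::string& line) {
  if (line.empty()) return -1;
  char* end = nullptr;
  long v = std::strtol(line.c_str(), &end, 10);
  if (end == line.c_str() || v < 0 || v > 1023) return -1;
  return static_cast<int>(v);
}
```

Validating the range on the Raspi side guards against garbled bytes on the serial line.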
Support and Contributing
I am urgently looking for contributors!
So far, this project has been developed by me alone. I have not set up chat rooms, an issue tracker or anything else, as I'm not familiar with them and do not have any contributors yet. If you want to join the project, just contact me (here on GitHub, on Hackaday.io or via my homepage), and we will see how we can work it out.