This repo contains the source code for my project using dancing to modify my vision. Check out the post for more details on the experiment.
This Python script collects movement data from four LilyPad analog accelerometers and sends events over a websocket to the VR headset. It targets a Raspberry Pi that uses an MCP3008 analog-to-digital converter for the Pulse Sensor.
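As a rough sketch of what the collector does with each sample (the helper name and the 3.3 V reference voltage here are assumptions for illustration; the real logic lives in sensor/collector.py), an MCP3008 reading is a 10-bit value that can be packaged as a JSON event before it goes out over the websocket:

```python
import json

V_REF = 3.3  # assumed MCP3008 reference voltage on the Pi's 3.3 V rail

def reading_to_event(channel, raw):
    """Convert a 10-bit MCP3008 reading (0-1023) to a JSON event string."""
    if not 0 <= raw <= 1023:
        raise ValueError("MCP3008 readings are 10-bit (0-1023)")
    return json.dumps({
        "channel": channel,
        "raw": raw,
        "volts": round(raw / 1023 * V_REF, 3),
    })
```

For example, `reading_to_event(0, 512)` reports roughly half the reference voltage on channel 0.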
The code requires Python 3.5+ and the RPi.GPIO, MCP3008, and websockets libraries.
To start it, simply run:
$ python3 sensor/collector.py eth0
The script arguments specify which network devices to serve the websocket on.
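A minimal sketch of that argument handling (the exact parser setup is an assumption; the real script may read `sys.argv` differently), accepting one or more interface names such as `eth0`:

```python
import argparse

def parse_args(argv):
    """Parse the network device names the websocket server should bind to."""
    parser = argparse.ArgumentParser(
        description="Serve sensor events over a websocket")
    parser.add_argument(
        "devices", nargs="+",
        help="network interfaces to serve on, e.g. eth0 wlan0")
    return parser.parse_args(argv)
```

With this, `parse_args(["eth0"]).devices` yields `["eth0"]`, and passing several interfaces serves the websocket on each of them.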
The website is designed to be run on an iPhone used with Google Cardboard. It takes the mjpeg stream from the camera and the heartbeat events, and uses WebGL to modify your vision in realtime.
The site uses webpack. To run it:
$ cd viewer
$ npm install
# Edit `src/config.js` to provide the expected IP address of the Raspberry Pi
# and rebuild
$ webpack
# serve up index.html somehow, e.g.
$ http-server
Note: You have to provide your own mp3 music files for some of these steps; they are not included in the repo.