Sonic Gesture is a computer vision project in which learned hand poses are translated into corresponding commands with the aim of generating sound. In other words, this software makes it possible to create and manipulate sound without physical interaction with a computer.
Unfortunately I haven't had time to update this project in a while; I'm not even sure it still compiles. The
sonicgesture folder contains the C++ code, which uses CMake, Qt and OpenCV.
- evaluate: contains performance evaluation data
- puredata: contains a puredata toy patch
- pysonic-gesture: an early Python-based prototype
- sonicgesture: the C++ source code
Sonic Gesture in an example performance setting: