Translating hand poses and gestures into sound
About

Sonic Gesture is a computer vision project in which learned hand poses are translated into corresponding commands with the aim of generating sound. In other words, this software makes it possible to create and manipulate sound without physical interaction with a computer.

This is the graduation project of Gijs Molenaar for the Master Artificial Intelligence at the University of Amsterdam. You can find the thesis here.

Installation

Unfortunately I haven't had time to update this project in a while, and I'm not even sure it still compiles. The sonicgesture folder contains the C++ code, which uses CMake, Qt and OpenCV.

Other folders:

  • evaluate: contains performance evaluation data
  • puredata: contains a puredata toy patch
  • pysonic-gesture: an early Python-based prototype
  • sonicgesture: the C++ source code
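Since the project uses CMake, building the C++ code in sonicgesture would likely follow the standard out-of-source CMake workflow. This is an untested sketch; the exact configure options and required Qt/OpenCV versions are assumptions:

```shell
# Typical out-of-source CMake build (untested; assumes the Qt and
# OpenCV development packages are already installed system-wide).
cd sonicgesture
mkdir -p build && cd build
cmake ..   # configure; CMake should locate Qt and OpenCV via find_package
make       # compile the sonicgesture binary
```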

Examples

Sonic Gesture in an example performance setting:

http://www.youtube.com/watch?v=GbB5jZm3ROw

http://www.youtube.com/watch?v=scUl1haoOfk&ap=%2526fmt%3D22