Thesis F2013-S2014
Princeton University
Advisor: Professor Naveen Verma
These are the final files relevant to developing this project further.
They are the result of several iterations on different mechanisms of interaction, gesture recognition, position localization, etc. I've done my best to reduce the code to something simple and small for future applications.
For a demonstration of the system, please check out the following YouTube videos:
Raw System Readout: https://www.youtube.com/watch?v=tzz5eXb3yZc
Position Localization on ITO Display: https://www.youtube.com/watch?v=5hDlvL9fHXY
Gesture Recognition + Integrated System: http://youtu.be/z-SjWKPr0Zg (In the video, I am flipping through Keynote slides using right-swipe, left-swipe, and tap gestures. Note the translation invariance.)
The written report, in PDF form, is at: TBD
This system detects small changes in capacitance created by the presence of a human hand. The sensors are arranged in a 2D grid with ~15 cm between parallel sensors. It can localize position up to ~20 cm away with copper sensors and ~15 cm away with ITO sensors, and it can detect proximity from about 30 cm away.

The system uses dynamic time warping (DTW) for some simple gesture recognition. This algorithm was chosen because it makes it easy to experiment with different types of gestures and to add new gestures to our dictionary.
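As a sketch of how DTW-based gesture recognition works in general: each incoming sensor trace is warped against a stored template per gesture, and the nearest template wins. Everything below (function names, the toy swipe/tap templates) is a hypothetical illustration, not the code from this repo:

```python
from math import inf

def dtw_distance(a, b):
    """Classic DTW distance between two 1-D sequences, O(len(a)*len(b))."""
    n, m = len(a), len(b)
    # Cost matrix with an infinite boundary row/column; D[0][0] seeds the path.
    D = [[inf] * (m + 1) for _ in range(n + 1)]
    D[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            # Best of: repeat a sample, skip a sample, or advance both.
            D[i][j] = cost + min(D[i - 1][j], D[i][j - 1], D[i - 1][j - 1])
    return D[n][m]

def classify(trace, templates):
    """Return the label of the template nearest to the trace under DTW."""
    return min(templates, key=lambda label: dtw_distance(trace, templates[label]))

# Toy templates: a swipe shows up as a ramp in delta-f, a tap as a pulse.
templates = {
    "right_swipe": [0, 1, 2, 3, 4],
    "left_swipe": [4, 3, 2, 1, 0],
    "tap": [0, 4, 0],
}
```

Adding a new gesture is just adding one more template to the dictionary, which is why DTW is convenient for this kind of experimentation: no retraining step is needed.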
Thanks.
Abbreviations:
- df = deltaf = the change in frequency caused by human presence
- adc = the raw digital values received from the board (ADC = analog-to-digital conversion)
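To illustrate the relationship between the two abbreviations: df is typically derived from the adc stream by subtracting a no-hand baseline per sensor. The helper below is a hypothetical sketch (the function name, the sign convention, and the `hz_per_count` scale factor are all assumptions; the actual mapping depends on the oscillator and board design):

```python
def adc_to_df(adc_readings, baselines, hz_per_count=1.0):
    """Convert raw per-sensor ADC counts to delta-f values.

    adc_readings: current ADC counts, one per sensor.
    baselines:    calibrated no-hand ADC counts, one per sensor.
    hz_per_count: hypothetical scale factor from counts to Hz.
    """
    # A hand loads the sensor, lowering the count; baseline minus reading
    # gives a positive df when a hand is present (sign convention assumed).
    return [(base - adc) * hz_per_count
            for adc, base in zip(adc_readings, baselines)]
```

In practice the baselines would be re-measured periodically to track temperature and humidity drift.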