Detecting gestural inputs on surfaces

## SurfaceTouch

We imagine affordably turning every surface into an interactive surface that responds to touch input. The aim is to build a device that can recognize a wide range of gestures on a variety of surfaces and map those recognized gestures to operations on other devices. This would let the nearest wall, a table top, your clothes, even your teeth accept touch input that controls your phone's calls, your music player, your presentation's slides, and so on.

## Current Status

At this early stage, the prototype takes input from the default line-in (the microphone input) and processes it. The only gestures it distinguishes are different sequences of taps. The prototype code has a few shortcomings, and the different folders hold the different approaches we tried for the problems we faced. The closest-to-working code is in MultiTap_timer, sketched in outline below.
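To illustrate the kind of processing involved, here is a minimal sketch of amplitude-threshold tap counting, assuming a Processing sketch using the Minim library (the folder names suggest Processing sketches). The threshold, debounce, and timeout values are illustrative assumptions, not the tuned values from the prototypes.

```processing
import ddf.minim.*;

Minim minim;
AudioInput in;

// Illustrative values; real prototypes would need tuning per surface and microphone.
float tapThreshold = 0.08;  // amplitude level treated as a tap
int   debounceMs   = 120;   // ignore re-triggers this soon after a tap
int   windowMs     = 600;   // silence this long closes the tap sequence

int tapCount = 0;
int lastTapAt = -1;

void setup() {
  size(400, 200);
  minim = new Minim(this);
  in = minim.getLineIn(Minim.MONO, 512);  // default line-in / microphone
}

void draw() {
  background(0);
  int now = millis();

  // A loud enough frame counts as a tap, unless it falls inside the debounce window.
  if (in.mix.level() > tapThreshold && (lastTapAt < 0 || now - lastTapAt > debounceMs)) {
    tapCount++;
    lastTapAt = now;
  }

  // Once the window since the last tap elapses, report the sequence and reset.
  if (tapCount > 0 && now - lastTapAt > windowMs) {
    println(tapCount + " tap(s) detected");
    tapCount = 0;
    lastTapAt = -1;
  }

  text("taps in current sequence: " + tapCount, 20, height / 2);
}
```

The timeout after the last tap is what separates a double tap from two single taps, which is presumably the role of the timer in MultiTap_timer; distinguishing more gestures than tap counts would need richer features than a simple amplitude threshold.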

## Making

This was prototyped over three days at the MIT DI (Design Innovation) 2013 workshop at the PES Institute of Technology, Bangalore. The team comprised Arpit Agarwal, Pallavi Gupta, Rishika Jain, Somya Mehdiratta, and Vishesh Kumar.