The advent of wearable computing devices has opened a myriad of opportunities to explore their interaction modalities. Interaction techniques have been addressed with various gesture sets, which continue to evolve as the underlying technology advances. These gesture sets are customized, enhanced, extended, or optimized for the specific wearable device being developed. In this work, a new gesture recognition system, SmartGest, is proposed to recognize gestures that can be performed on a smartwatch using the same hand on which the watch is worn. The gesture recognizer will be built with machine learning algorithms trained on data from the sensors and engines available by default in recent smartwatches, e.g. the accelerometer, gyroscope, infrared sensors, taptic engine, microphone, photosensors, and visible-light LEDs.
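A common first step in such a pipeline is segmenting the raw motion-sensor stream into fixed-length windows and extracting summary features for a classifier. The sketch below illustrates this idea in Python; the window size, step, and feature choices are illustrative assumptions, not part of the SmartGest design.

```python
import math

def extract_features(window):
    """Compute simple per-axis features (mean, std dev, range) from one
    window of 3-axis accelerometer samples given as [(x, y, z), ...]."""
    features = []
    for axis in range(3):
        values = [sample[axis] for sample in window]
        mean = sum(values) / len(values)
        var = sum((v - mean) ** 2 for v in values) / len(values)
        features.extend([mean, math.sqrt(var), max(values) - min(values)])
    return features

def sliding_windows(samples, size=50, step=25):
    """Yield overlapping windows of `size` samples, advancing by `step`."""
    for start in range(0, len(samples) - size + 1, step):
        yield samples[start:start + size]

# Synthetic stream: 100 samples of a watch at rest (gravity on the z-axis).
stream = [(0.0, 0.0, 9.81)] * 100
feature_vectors = [extract_features(w) for w in sliding_windows(stream)]
```

The resulting feature vectors would then feed a conventional classifier (e.g. a decision tree or SVM); richer features such as spectral energy are typically added once the basic pipeline works.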
- [Android Sensors Overview](http://developer.android.com/guide/topics/sensors/sensors_overview.html)
- [Android Sensor Types - Overview](https://source.android.com/devices/sensors/sensor-types.html)
- [Guide to Smartwatch app development from Sensors and API perspective](https://github.com/nathan5x/SmartGest/blob/master/Designs/SmartwatchSensors_Guide_v0.2.pdf)