Exermote Gathering Data
Since the subsequent learning procedure is supervised, labeled data is needed.
To record training data from different individuals, I used two different types of devices:
- iPhone on the right upper arm: 12 data features (3x gravity, 3x acceleration, 3x Euler angles, 3x rotation rate)
- 6x Estimote Nearables on chest, belly, hands and feet: 4 data features each (3x acceleration, 1x RSSI)
So there are 36 data features in total. The Nearables were reshaped using Stewalin, a muffin mold and some Velcro cable ties :)
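Laid out as a flat vector, the 36 features per sample could look like this (a minimal sketch in Python for illustration; the feature names and their ordering are my assumptions, not the project's actual schema):

```python
# Sketch of the 36-feature layout per sample (names are hypothetical).
IPHONE_FEATURES = [
    "gravity_x", "gravity_y", "gravity_z",
    "acceleration_x", "acceleration_y", "acceleration_z",
    "euler_yaw", "euler_pitch", "euler_roll",
    "rotation_rate_x", "rotation_rate_y", "rotation_rate_z",
]  # 12 features from the iPhone on the upper arm

NEARABLE_POSITIONS = [
    "chest", "belly", "left_hand", "right_hand", "left_foot", "right_foot",
]

def nearable_features(position):
    # 4 features per Nearable: 3x acceleration + 1x RSSI
    return [f"{position}_acc_x", f"{position}_acc_y",
            f"{position}_acc_z", f"{position}_rssi"]

ALL_FEATURES = IPHONE_FEATURES + [
    name for p in NEARABLE_POSITIONS for name in nearable_features(p)
]
print(len(ALL_FEATURES))  # 12 + 6 * 4 = 36
```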
The recording frequency was 10 Hz, since the Nearable broadcast frequency is limited to this value on the hardware side. Because the official SDK only allows reading Nearable acceleration data once per second, I had to access and decode the advertisement data directly via
CBCentralManager. Many thanks to reelyactive for the inspiration.
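The byte-level decoding step could look roughly like this (a sketch in Python for illustration only; the real app does this in Swift via CoreBluetooth, and the payload offsets and scaling shown here are hypothetical, not the actual Nearable frame layout):

```python
import struct

def decode_nearable_advertisement(payload: bytes, rssi: int):
    """Decode a raw Nearable advertisement frame.

    Hypothetical layout: three signed 8-bit acceleration values at
    byte offsets 1..3, scaled to an assumed +/-2 g range. The RSSI is
    reported by the central alongside the advertisement, not encoded
    in the payload itself.
    """
    if len(payload) < 4:
        raise ValueError("payload too short")
    ax, ay, az = struct.unpack_from("bbb", payload, 1)
    scale = 2.0 / 127.0  # assumed mapping of a signed byte to +/-2 g
    return {
        "acc_x": ax * scale,
        "acc_y": ay * scale,
        "acc_z": az * scale,
        "rssi": rssi,
    }

sample = decode_nearable_advertisement(bytes([0x01, 10, -5 & 0xFF, 127]), rssi=-68)
```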
Before recording started, a 5-minute training plan consisting of "burpees", "squats", "situps", "set breaks" and "breaks" was randomly generated.
| time | exercise type | exercise sub type | 36 data feature columns ... |
| --- | --- | --- | --- |
| 0.1 s | set break | set break | ... |
| 0.2 s | set break | set break | ... |
| 0.3 s | set break | set break | ... |
| 0.4 s | set break | set break | ... |
| 0.5 s | set break | set break | ... |
| 0.6 s | burpee | first half | ... |
| 0.7 s | burpee | first half | ... |
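Such a plan and its 10 Hz label stream could be generated along these lines (a sketch; the durations, repetition counts and exercise list are assumptions, and plain "breaks" from the original plan are omitted for brevity):

```python
import random

EXERCISES = ["burpee", "squat", "situp"]

def generate_training(duration_s=300, rep_s=4.0, reps_per_set=5,
                      set_break_s=10.0, seed=0):
    """Randomly assemble a ~5 minute training plan of exercise sets
    separated by set breaks, expanded to one label per 0.1 s sample.
    All timing parameters here are assumed values for illustration."""
    rng = random.Random(seed)
    labels = []
    t = 0.0
    while t < duration_s:
        # set break before each set
        labels.extend([("set break", "set break")] * int(set_break_s * 10))
        t += set_break_s
        exercise = rng.choice(EXERCISES)
        for _ in range(reps_per_set):
            # each repetition is split into a first and a second half
            half = int(rep_s * 10 / 2)
            labels.extend([(exercise, "first half")] * half)
            labels.extend([(exercise, "second half")] * half)
        t += reps_per_set * rep_s
    return labels

labels = generate_training()
```

At 10 Hz, a 300-second plan expands to 3000 labeled rows, which the recorder then joins with the 36 feature columns.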
To ensure that the exercising individuals trained according to the pre-generated plan, so that the labels matched the recorded movement data perfectly, the app gave spoken hints about which exercise would follow. Additionally, a whistle was generated whose pitch decreased during the first half and increased during the second half of each exercise repetition.
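The pitch curve can be modeled as a simple function of repetition progress (a sketch; the base frequency and span are made-up values, not the app's actual tones):

```python
def whistle_pitch(progress, f_base=880.0, f_span=220.0):
    """Whistle pitch over one repetition: falls during the first half,
    rises during the second half. Frequencies in Hz are assumed values."""
    if not 0.0 <= progress <= 1.0:
        raise ValueError("progress must be in [0, 1]")
    if progress <= 0.5:
        # first half: linear fall from f_base to f_base - f_span
        return f_base - f_span * (progress / 0.5)
    # second half: linear rise back up to f_base
    return f_base - f_span + f_span * ((progress - 0.5) / 0.5)

# pitch at start and end equals f_base; minimum at the midpoint
```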
When recording was finished, the raw data, containing 3 hours (= 108,000 data points at 10 Hz) of movements from 6 individuals, was saved to my iCloud drive.