# Machine Learning Project on Smartphone Activity Detector

We built machine learning models that recognize human activity from the sensor readings of a carried smartphone.
`all_models.ipynb` is the main file, where all the models are compiled.
The dataset comes from the UCI Machine Learning Repository. It was built from recordings of 30 subjects performing basic activities and postural transitions while carrying a waist-mounted smartphone (Samsung Galaxy S II) with embedded inertial sensors (accelerometer and gyroscope).
The dataset was already cleaned, scaled, and split into training and test sets. Each instance of activity is described by a 561-feature vector.
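As a minimal sketch, the pre-split files can be loaded with NumPy; the file names below follow the usual UCI archive layout and are assumptions, not paths defined by this project:

```python
import numpy as np

# Load the pre-split data; paths are assumptions based on the
# UCI archive layout (whitespace-separated text files).
X_train = np.loadtxt("Train/X_train.txt")            # shape: (n_train, 561)
y_train = np.loadtxt("Train/y_train.txt", dtype=int) # labels 1-12
X_test = np.loadtxt("Test/X_test.txt")               # shape: (n_test, 561)
y_test = np.loadtxt("Test/y_test.txt", dtype=int)
```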
The labels cover twelve activities: six basic activities, namely three static postures (standing, sitting, lying) and three dynamic activities (walking, walking downstairs, and walking upstairs), plus the six postural transitions that occur between the static postures: stand-to-sit, sit-to-stand, sit-to-lie, lie-to-sit, stand-to-lie, and lie-to-stand. Each activity is labeled with a number as follows:
1. WALKING
2. WALKING_UPSTAIRS
3. WALKING_DOWNSTAIRS
4. SITTING
5. STANDING
6. LAYING
7. STAND_TO_SIT
8. SIT_TO_STAND
9. SIT_TO_LIE
10. LIE_TO_SIT
11. STAND_TO_LIE
12. LIE_TO_STAND
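In code, this numbering can be captured as a simple lookup table; this is an illustrative sketch, not a map defined in the notebook:

```python
# Numeric label -> activity name, following the numbering above.
ACTIVITY_NAMES = {
    1: "WALKING", 2: "WALKING_UPSTAIRS", 3: "WALKING_DOWNSTAIRS",
    4: "SITTING", 5: "STANDING", 6: "LAYING",
    7: "STAND_TO_SIT", 8: "SIT_TO_STAND", 9: "SIT_TO_LIE",
    10: "LIE_TO_SIT", 11: "STAND_TO_LIE", 12: "LIE_TO_STAND",
}

print(ACTIVITY_NAMES[1])  # -> "WALKING"
```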
We created several models with scikit-learn and TensorFlow Keras. Of these, the linear support vector machine and the deep neural network performed best, with test accuracies of 95.2% and 94.3%, respectively.
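For illustration, here is a minimal sketch of the two best-performing model types, reusing the arrays from the loading sketch above; the hyperparameters and network architecture are assumptions, not the exact settings used in `all_models.ipynb`:

```python
from sklearn.svm import LinearSVC
from sklearn.metrics import accuracy_score
from tensorflow import keras

# Linear support vector machine (scikit-learn).
svm = LinearSVC(C=1.0, max_iter=5000)  # assumed hyperparameters
svm.fit(X_train, y_train)
print("SVM test accuracy:", accuracy_score(y_test, svm.predict(X_test)))

# Small dense neural network (Keras). Labels are shifted to 0-11
# so they work with sparse_categorical_crossentropy.
dnn = keras.Sequential([
    keras.layers.Input(shape=(561,)),
    keras.layers.Dense(128, activation="relu"),
    keras.layers.Dense(64, activation="relu"),
    keras.layers.Dense(12, activation="softmax"),
])
dnn.compile(optimizer="adam",
            loss="sparse_categorical_crossentropy",
            metrics=["accuracy"])
dnn.fit(X_train, y_train - 1, epochs=20, batch_size=64, verbose=0)
_, dnn_acc = dnn.evaluate(X_test, y_test - 1, verbose=0)
print("DNN test accuracy:", dnn_acc)
```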