Human Activity Recognizer

This is an example project that demonstrates the problem of human activity recognition (HAR) given mobile phone sensor data (gyroscope, accelerometer, etc.). The training data are human-annotated sensor readings of 30 volunteers performing various tasks such as sitting, standing, and laying down. Each sample is 561-dimensional; however, we demonstrate that with a technique called random projection we can reduce the dimensionality to 120 without any loss in accuracy. The estimator employed to make the predictions is a Softmax Classifier, which is a multiclass generalization of the Logistic Regression classifier used in the Credit Card Default Predictor example project.

  • Difficulty: Medium
  • Training time: < 5 Minutes
  • Memory needed: < 1G
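To give a feel for what random projection does, here is a minimal plain-PHP sketch of the idea: each high-dimensional sample is multiplied by a Gaussian random matrix to produce a lower-dimensional embedding. This is an illustration only; the project itself uses a Rubix ML transformer, and the function name and signature below are assumptions, not part of the project's code.

```php
<?php

/**
 * Illustrative sketch of Gaussian random projection (not the project's actual code).
 * Projects each sample (an array of floats) down to $dOut dimensions by
 * multiplying it with a random matrix whose entries are drawn from N(0, 1/dOut).
 */
function randomProject(array $samples, int $dOut): array
{
    $dIn = count($samples[0]);

    // Build the random projection matrix R (dIn x dOut).
    $r = [];
    for ($i = 0; $i < $dIn; $i++) {
        for ($j = 0; $j < $dOut; $j++) {
            // Box-Muller transform for a standard Gaussian sample.
            $u1 = mt_rand(1, mt_getrandmax()) / mt_getrandmax();
            $u2 = mt_rand(0, mt_getrandmax()) / mt_getrandmax();
            $z = sqrt(-2.0 * log($u1)) * cos(2.0 * M_PI * $u2);
            $r[$i][$j] = $z / sqrt($dOut);
        }
    }

    // Multiply every sample by R to obtain its low-dimensional embedding.
    $projected = [];
    foreach ($samples as $sample) {
        $row = array_fill(0, $dOut, 0.0);
        foreach ($sample as $i => $value) {
            for ($j = 0; $j < $dOut; $j++) {
                $row[$j] += $value * $r[$i][$j];
            }
        }
        $projected[] = $row;
    }

    return $projected;
}
```

Calling `randomProject($samples, 120)` on 561-dimensional samples yields 120-dimensional samples whose pairwise distances are approximately preserved, which is why accuracy survives the reduction (the Johnson-Lindenstrauss lemma).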

Installation

Clone the repository locally using Git:

$ git clone https://github.com/RubixML/HAR

Install dependencies using Composer:

$ composer install

Requirements

  • PHP 7.1.3 or above

Project Description

The experiments were carried out with a group of 30 volunteers within an age bracket of 19-48 years. Each person performed six activities (WALKING, WALKING_UPSTAIRS, WALKING_DOWNSTAIRS, SITTING, STANDING, LAYING) while wearing a smartphone (Samsung Galaxy S II) on the waist. Using its embedded accelerometer and gyroscope, we captured 3-axial linear acceleration and 3-axial angular velocity at a constant rate of 50Hz. The experiments were video-recorded so the data could be labeled manually. The dataset was randomly partitioned into two sets, with 70% of the volunteers selected for generating the training data and 30% for the test data.

The sensor signals (accelerometer and gyroscope) were pre-processed by applying noise filters and then sampled in fixed-width sliding windows of 2.56 sec and 50% overlap (128 readings/window). The sensor acceleration signal, which has gravitational and body motion components, was separated using a Butterworth low-pass filter into body acceleration and gravity. The gravitational force is assumed to have only low frequency components, therefore a filter with 0.3 Hz cutoff frequency was used. From each window, a vector of features was obtained by calculating variables from the time and frequency domain. See 'features_info.txt' for more details.
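The windowing described above (2.56 sec at 50Hz = 128 readings per window, with 50% overlap = a stride of 64 readings) can be sketched in plain PHP. The function below is an illustration of the scheme, not code from the project or the original study.

```php
<?php

/**
 * Illustrative sketch of fixed-width sliding windows with overlap, as described
 * above: 128 readings per window and 50% overlap (a stride of 64 readings).
 * Trailing readings that do not fill a complete window are dropped.
 */
function slidingWindows(array $signal, int $width = 128, float $overlap = 0.5): array
{
    // A 50% overlap means each window starts half a window after the last one.
    $stride = (int) round($width * (1.0 - $overlap));

    $windows = [];
    for ($start = 0; $start + $width <= count($signal); $start += $stride) {
        $windows[] = array_slice($signal, $start, $width);
    }

    return $windows;
}
```

For example, 640 readings (12.8 seconds at 50Hz) produce 9 overlapping windows, starting at readings 0, 64, 128, ..., 512. In the actual dataset a feature vector was then computed from each window rather than using the raw readings directly.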

Tutorial

On the map ...

Original Dataset

Contact: Jorge L. Reyes-Ortiz(1,2), Davide Anguita(1), Alessandro Ghio(1), Luca Oneto(1) and Xavier Parra(2)

Institutions:
1 - Smartlab - Non-Linear Complex Systems Laboratory, DITEN - Università degli Studi di Genova, Genoa (I-16145), Italy.
2 - CETpD - Technical Research Centre for Dependency Care and Autonomous Living, Universitat Politècnica de Catalunya (BarcelonaTech), Vilanova i la Geltrú (08800), Spain.

activityrecognition '@' smartlab.ws

References:

[1] Davide Anguita, Alessandro Ghio, Luca Oneto, Xavier Parra and Jorge L. Reyes-Ortiz. A Public Domain Dataset for Human Activity Recognition Using Smartphones. 21st European Symposium on Artificial Neural Networks, Computational Intelligence and Machine Learning, ESANN 2013. Bruges, Belgium, 24-26 April 2013.
