
Human-Activity-Recognition-ML

Using machine learning to determine human activity from data recorded by a 6 d.o.f. IMU sensor.

The experiments were carried out with a group of 30 volunteers within an age bracket of 19-48 years. They performed a protocol composed of six basic activities: three static postures (standing, sitting, lying) and three dynamic activities (walking, walking downstairs, and walking upstairs). The experiment also included the postural transitions that occurred between the static postures: stand-to-sit, sit-to-stand, sit-to-lie, lie-to-sit, stand-to-lie, and lie-to-stand. All the participants wore a smartphone (Samsung Galaxy S II) on the waist during the experiment. We captured 3-axial linear acceleration and 3-axial angular velocity at a constant rate of 50Hz using the embedded accelerometer and gyroscope of the device. The experiments were video-recorded so the data could be labeled manually. The obtained dataset was randomly partitioned into two sets: 70% of the volunteers were selected for generating the training data and 30% for the test data.

The sensor signals (accelerometer and gyroscope) were pre-processed by applying noise filters and then sampled in fixed-width sliding windows of 2.56 sec and 50% overlap (128 readings/window). The sensor acceleration signal, which has gravitational and body motion components, was separated using a Butterworth low-pass filter into body acceleration and gravity. The gravitational force is assumed to have only low frequency components, therefore a filter with 0.3 Hz cutoff frequency was used. From each window, a vector of 561 features was obtained by calculating variables from the time and frequency domain. See 'features_info.txt' for more details.
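
For illustration, here is a minimal sketch of this pre-processing step in Python (using numpy and scipy). The filter order and all function and variable names are our own assumptions, not taken from the repository's scripts:

```python
import numpy as np
from scipy.signal import butter, filtfilt

FS = 50.0       # sampling rate (Hz), per the dataset description
CUTOFF = 0.3    # gravity cutoff frequency (Hz)
WINDOW = 128    # 2.56 s windows at 50 Hz
STEP = 64       # 50% overlap

def separate_gravity(acc, fs=FS, cutoff=CUTOFF, order=3):
    """Split total acceleration into body and gravity components
    using a low-pass Butterworth filter applied along each axis.
    (Filter order is our choice; the README does not specify it.)"""
    b, a = butter(order, cutoff / (fs / 2.0), btype="low")
    gravity = filtfilt(b, a, acc, axis=0)
    return acc - gravity, gravity

def sliding_windows(signal, window=WINDOW, step=STEP):
    """Yield fixed-width windows with 50% overlap."""
    for start in range(0, len(signal) - window + 1, step):
        yield signal[start:start + window]

# acc is an (N, 3) array of total acceleration in g's
acc = np.loadtxt("RawData/acc_exp01_user01.txt")
body, gravity = separate_gravity(acc)
windows = list(sliding_windows(body))   # each window is (128, 3)
```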

This dataset is an extended version of the UCI Human Activity Recognition Using Smartphones Dataset, which can be found at: https://archive.ics.uci.edu/ml/datasets/Human+Activity+Recognition+Using+Smartphones This version provides the original raw inertial signals from the smartphone sensors instead of the windowed, pre-processed signals provided in version 1. This change was made so that online tests can be run against the data. Moreover, the activity labels were updated to include the postural transitions that were not part of the previous version of the dataset.

The dataset is divided into two parts, which can be used separately:

  1. Inertial sensor data
  • Raw triaxial signals from the accelerometer and gyroscope for all the trials with participants.
  • The labels of all the performed activities.
  2. Records of activity windows, each one composed of:
  • A 561-feature vector with time and frequency domain variables.
  • Its associated activity label.
  • An identifier of the subject who carried out the experiment.

The dataset includes the following files:

  • 'README.txt'

  • 'RawData/acc_expXX_userYY.txt': The raw triaxial acceleration signal for experiment number XX, associated with user number YY. Every row is one acceleration sample (three axes) captured at a frequency of 50Hz.

  • 'RawData/gyro_expXX_userYY.txt': The raw triaxial angular speed signal for experiment number XX, associated with user number YY. Every row is one angular velocity sample (three axes) captured at a frequency of 50Hz.

  • 'RawData/labels.txt': Includes all the activity labels available for the dataset (one per row), with the following columns (a loading sketch follows this file list):
    ◦ Column 1: experiment number ID
    ◦ Column 2: user number ID
    ◦ Column 3: activity number ID
    ◦ Column 4: label start point (in number of signal log samples, recorded at 50Hz)
    ◦ Column 5: label end point (in number of signal log samples)

  • 'RawData/remove noise from raw data.py': Script that removes the noise from the original raw data and then creates the cleaned dataset files x.csv and y.csv.

  • 'features_info.txt': Shows information about the variables used on the feature vector.

  • 'activity_labels.txt': Links the activity ID with their activity name.

  • 'cleaned data/x*.csv': The cleaned dataset files after noise removal. Cleaning is done by the code in 'RawData/remove noise from raw data.py'. The data is split into x1, x2, x3, x4, and x5 because the combined file is too large to upload to GitHub.

  • 'cleaned data/y.csv': The labels for the combined (x1, x2, x3, x4, x5) dataset.

  • 'Research Papers Studied/': Research papers studied for this project.
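
As a rough illustration of how the raw files fit together, here is a minimal loading sketch in Python. This is our own code, not part of the repository; the file paths follow the naming scheme above, and treating the label end point as inclusive is our assumption:

```python
import numpy as np

# labels.txt columns: experiment ID, user ID, activity ID,
# label start sample, label end sample (all sampled at 50 Hz)
labels = np.loadtxt("RawData/labels.txt", dtype=int)

exp_id, user_id = 1, 1   # hypothetical experiment/user pair
acc = np.loadtxt(f"RawData/acc_exp{exp_id:02d}_user{user_id:02d}.txt")
gyro = np.loadtxt(f"RawData/gyro_exp{exp_id:02d}_user{user_id:02d}.txt")

# Keep only the label rows belonging to this experiment/user pair
rows = labels[(labels[:, 0] == exp_id) & (labels[:, 1] == user_id)]

segments = []
for _, _, activity_id, start, end in rows:
    # We treat the end point as inclusive; verify against the data.
    segments.append((activity_id, acc[start:end + 1], gyro[start:end + 1]))

print(f"{len(segments)} labeled segments in experiment {exp_id}")
```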

Notes:

  • The units used for the accelerations (total and body) are 'g's (gravity of Earth -> 9.80665 m/s²).
  • The gyroscope units are rad/s.
  • A video of the experiment, including an example of the 6 recorded activities with one of the participants, can be seen at the following link: http://www.youtube.com/watch?v=XOEN9W05_4A
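
Putting the pieces together, here is a hypothetical end-to-end sketch that trains a simple classifier on the cleaned dataset described above ('cleaned data/x1.csv' through 'x5.csv' plus 'y.csv'). The CSV header layout and the choice of model are our assumptions, not the repository's documented pipeline:

```python
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

# Reassemble the five shards; they were split only because the
# combined file is too large for GitHub. We assume each CSV has a
# header row and no index column; adjust read_csv() if not.
X = pd.concat(
    [pd.read_csv(f"cleaned data/x{i}.csv") for i in range(1, 6)],
    ignore_index=True,
)
y = pd.read_csv("cleaned data/y.csv").values.ravel()

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=42, stratify=y
)

clf = RandomForestClassifier(n_estimators=100, random_state=42)
clf.fit(X_train, y_train)
print("test accuracy:", accuracy_score(y_test, clf.predict(X_test)))
```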

AUTHORS: Jaimin and Ummul
