Pointing gesture recognition

Automatic recognition of pointing gestures via an artificial neural network, written in Python and using a depth sensor.

Requirements

  • Python, with matplotlib for the accuracy plots
  • OpenNI and a compatible depth sensor

Getting started

  • Open a terminal and run the main.py file (CLI-only version: live.py); see the commands below.
  • Stand up and point!
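
Assuming both scripts sit at the repository root and a Python interpreter is on the path, the invocation is simply:

```sh
python main.py   # full version with the live GUI
python live.py   # CLI-only version
```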

[Screenshot: live GUI]

Performance and accuracy

On average, the pointed direction misses the target by 18.6 cm, an angular error of 5.3° (equivalent to pointing from roughly 2 m away, since atan(0.186 m / 2.0 m) ≈ 5.3°).

[Figure: impacts heatmap]

The current neural network achieves the following success rates:

  • 98.50 % on the training set
  • 88.10 % on the test set
  • 88.10 % on the validation set

Configuration

  • Auto-calibration must be enabled in the OpenNI file FeatureExtraction.ini: under the [LBS] section, uncomment the UseAutoCalibration line and set it to 1, as shown below.
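
The relevant lines of FeatureExtraction.ini should then read:

```ini
[LBS]
UseAutoCalibration=1
```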

Advanced

Record new dataset items

  • Open a terminal and run the capture.py file.
  • Fill in the GUI form and shoot!

Train the network

  • Open a terminal and run the training.py file.
  • You can choose what kind of data to feed the network, as well as the network's parameters; a sketch of the idea follows.
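
training.py holds the project's actual logic; purely to illustrate the idea (a small feed-forward network mapping skeleton features to a pointed target), here is a minimal sketch. The scikit-learn usage, file paths, and array shapes are all assumptions, not the repository's code.

```python
# Minimal sketch only -- NOT the repository's training.py.
# Assumes a hypothetical dataset of skeleton-joint features (X) and
# pointed-target coordinates (y), stored as NumPy arrays.
import numpy as np
from sklearn.neural_network import MLPRegressor

X = np.load("dataset/features.npy")  # hypothetical path: joint coordinates per sample
y = np.load("dataset/targets.npy")   # hypothetical path: target position per sample

# One small hidden layer; the sizes here are illustrative, not the project's.
net = MLPRegressor(hidden_layer_sizes=(32,), max_iter=2000)
net.fit(X, y)
print("training score:", net.score(X, y))
```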

Validate the network

  • Open a terminal and run the validating.py file.
  • The same data and parameter choices as in the training step apply; see the continuation sketch below.
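
Continuing the hypothetical training sketch above, validation amounts to scoring the trained network on held-out samples:

```python
# Hypothetical continuation of the training sketch above.
X_val = np.load("dataset/features_val.npy")  # assumed path for held-out features
y_val = np.load("dataset/targets_val.npy")   # assumed path for held-out targets
print("validation score:", net.score(X_val, y_val))
```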

Check accuracy

  • Open a terminal and run the accuracy.py file.
  • Get the average pointed-direction accuracy.
  • Get a plot of all trajectories, or of all impacts relative to the target, rendered with matplotlib as illustrated in the performance section; a sketch follows.
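
As an illustration only (accuracy.py is the real implementation), an impacts plot of the kind shown in the performance section could be produced like this; the points here are simulated around the reported 18.6 cm average offset:

```python
# Sketch of an impacts-vs-target plot -- NOT the repository's accuracy.py.
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(0)
impacts = rng.normal(scale=0.186, size=(200, 2))  # simulated impact points (metres)

plt.scatter(impacts[:, 0], impacts[:, 1], s=10, alpha=0.5, label="impacts")
plt.scatter([0], [0], marker="x", color="red", label="target")
plt.axis("equal")
plt.xlabel("x offset (m)")
plt.ylabel("y offset (m)")
plt.legend()
plt.show()
```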

About

This experiment is part of my Software Engineering Master's dissertation at Oxford Brookes University and is meant to run on the university's RoboThespian unit.