nightvision04/simple-gesture-tracking

Quick Install

Install the library using Python's package manager, pip:

pip install head-controller

Purpose

Predict your webcam gestures in realtime!

Quickly train 4 gestures for the model to learn. Press the UP, DOWN, RIGHT, and LEFT arrow keys on your keyboard to label each gesture in realtime. After 30 seconds you'll be prompted to save (append) the new training data, and a cross-validation score of the fitted model will be displayed. This model doesn't use convolution; it's intended for fixed-camera, fixed-lighting setups.


[Image: Example of 4 distinct gesture inputs during training.]

[Image: Live prediction output of 'Gesture 1'.]
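
To give a rough sense of the approach, here is a minimal sketch of a non-convolutional pipeline: frames are flattened into feature vectors and a plain classifier is cross-validated on them. This is illustrative only, not this library's internals; the frame size, classifier choice, and synthetic stand-in data are all assumptions.

import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Stand-in for labelled webcam frames: 120 samples of 32x32 grayscale,
# one class per arrow key (0=UP, 1=DOWN, 2=LEFT, 3=RIGHT).
X = rng.random((120, 32 * 32))    # each row is a flattened frame
y = rng.integers(0, 4, size=120)  # arrow-key labels collected live

model = LogisticRegression(max_iter=1000)

# Cross-validation score of the fitted data, as reported after training.
scores = cross_val_score(model, X, y, cv=5)
print(f"CV accuracy: {scores.mean():.2f} +/- {scores.std():.2f}")

model.fit(X, y)  # final fit used for live prediction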

Requirements
  • Anaconda Python >= 3.5
Manual Installation
conda create --name head python=3.7
conda activate head
# Navigate to the head_controller directory
python setup.py install

Basic Usage

Import dependencies and start training from your webcam:

import head_controller.db as db
import head_controller.Camera as Camera

# Initialize gesture training data
db.setup_db()

# Capture webcam gestures with live arrow-key labelling.
# Hold DOWN, UP, RIGHT, or LEFT keys while gesturing into the camera.
Camera.capture_review_submit_labels()

# Predict your webcam gestures in realtime.
Camera.check_video_frame_data_predict()

To append more training samples, simply run the following call as many times as you like:

Camera.capture_review_submit_labels()
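
For example, a small wrapper (assuming each call blocks until its 30-second labelling session finishes and the samples are submitted) collects several rounds in one run:

import head_controller.Camera as Camera

# Each pass runs another labelling session and appends the reviewed
# samples to the existing training data.
for session in range(3):
    print(f"Labelling session {session + 1} of 3")
    Camera.capture_review_submit_labels()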
Future Updates
  • Add a class for continuously updating the db with live gesture predictions.
  • Add an API for accessing live gestures from other programs (one hypothetical shape is sketched below).
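
Purely as a sketch of what that API could look like, another process might poll the shared gesture database. Nothing below exists in the library yet; the database file, table, and column names are all made up for illustration.

import sqlite3
import time

# Hypothetical consumer: assumes live predictions are written to a SQLite
# table `live_gestures` with columns (gesture TEXT, ts REAL).
conn = sqlite3.connect("gestures.db")

while True:
    try:
        row = conn.execute(
            "SELECT gesture, ts FROM live_gestures ORDER BY ts DESC LIMIT 1"
        ).fetchone()
    except sqlite3.OperationalError:
        row = None  # table not created yet
    if row is not None:
        print(f"latest gesture: {row[0]}")
    time.sleep(0.1)  # poll at ~10 Hz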

Author

If you're interested in adding to this library or using it for a project, I would love to hear from you.

About

A simple script for non-convolution-based gesture modelling
