
Pipeline for Hand Gesture Recognition

This project develops a real-time dynamic hand gesture recognition system using the HaGRID and JESTER datasets. It combines a detection component and a classification component for accurate performance. The project structure, setup, and execution details are described below.

Project structure

Dataset

Detection

In this project, we use the HaGRID dataset for the detection task. You can download the YOLO-formatted version of the dataset here:

Classification

For the classification task, we use the JESTER dataset. You can download it here:

Setup

Set up the directory structure: Hand_Gesture/src.
After cloning this repository into src, run the following command to install the required packages and to download and set up the datasets:

./setup.sh

CAUTION: You must have at least 70 GB of free disk space available.

Execute

To train and evaluate the models, as well as to run the program, you have to tune parameters manually in the source code.

Detection

Train:

python detect_train.py
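For reference, if the detector is trained through the Ultralytics YOLO API, the call inside detect_train.py typically looks like the sketch below; the model weights, data config path, and hyperparameters here are assumptions, not the script's actual values.

```python
# Minimal sketch of YOLO training on the YOLO-format HaGRID data (assumed names/paths).
from ultralytics import YOLO

model = YOLO("yolov8n.pt")   # assumed: start from a pretrained YOLOv8-nano checkpoint
model.train(
    data="hagrid.yaml",      # assumed dataset config pointing to the HaGRID train/val splits
    epochs=50,               # illustrative hyperparameters only
    imgsz=640,
    batch=16,
)
```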

Evaluate (image or video mode):

python detect_eval.py
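As a rough illustration, evaluating a trained YOLO detector in image or video mode could look like the following; the weights and input paths are placeholders, not the ones detect_eval.py necessarily uses.

```python
# Sketch of running a trained detector on an image or a video file (illustrative paths).
from ultralytics import YOLO

model = YOLO("runs/detect/train/weights/best.pt")  # assumed location of the trained weights

# Image mode: detect hands in a single image and save the annotated output.
model.predict("sample_hand.jpg", save=True)

# Video mode: detect hands frame by frame in a video and save the annotated output.
model.predict("sample_video.mp4", save=True)
```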

Classification

Run the following command to train the model and save the metrics:

./train.sh

You can set small_version = True in classify_train.py to run a demo on a small version of the dataset.
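The small_version switch is a flag inside classify_train.py; the snippet below is only a hedged sketch of how such a flag might select a reduced JESTER subset, and the annotation file names are assumptions.

```python
# Illustrative sketch only: how a small_version flag could pick a reduced dataset.
small_version = True  # set to True to train on a small JESTER subset for a quick demo

if small_version:
    train_list = "annotations/jester-train-small.csv"  # assumed subset annotation file
else:
    train_list = "annotations/jester-train.csv"        # assumed full annotation file
```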

Evaluate:

python classify_eval.py

Run the model with a real-time camera:

python program.py
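For context, a real-time camera loop of this kind is typically built on OpenCV; the sketch below shows only the capture-and-display skeleton, and the detection/classification calls are placeholders rather than the actual functions in program.py.

```python
# Skeleton of a real-time webcam loop (OpenCV); the recognition step is a placeholder.
import cv2

cap = cv2.VideoCapture(0)  # open the default camera
while True:
    ok, frame = cap.read()
    if not ok:
        break

    # Placeholder: detect the hand region and classify the gesture here,
    # e.g. boxes = detector(frame); gesture = classifier(hand_clip)

    cv2.imshow("Hand Gesture Recognition", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):  # press 'q' to quit
        break

cap.release()
cv2.destroyAllWindows()
```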

Author

Quoc Khanh