American Sign Language (ASL) Recognition using Hand Landmarks
This is the source code of the article: Classifying American Sign Language Alphabets on the OAK-D
On a Raspberry Pi 4B, run in terminal:
```
git clone https://github.com/cortictechnology/hand_asl_recognition.git
cd hand_asl_recognition
bash install_dependencies.sh
```
- Make sure the OAK-D device is plugged into the Pi.
- In the terminal, run:
By default, ASL recognition is enabled.
Three models are provided in the `models` folder:
- palm_detection_6_shaves.blob: The palm detection model, which locates hand regions in the frame. Converted using OpenVINO's myriad_compile.
- hand_landmark_6_shaves.blob: The hand landmark model, which detects hand keypoints within the region found by the palm detection model. Converted using OpenVINO's myriad_compile.
- hand_asl_6_shaves.blob: The classification model, which maps the hand landmarks to ASL characters. Converted using OpenVINO's myriad_compile.
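The blobs run on the OAK-D itself; on the host side, the classifier's score vector still has to be decoded into a letter. A minimal sketch of that step (the label order, class count, and confidence threshold below are illustrative assumptions, not taken from the repository's code):

```python
# Hypothetical label set: the real class order must match the training data.
# Static ASL fingerspelling models commonly omit J and Z, which require motion.
ASL_LABELS = list("ABCDEFGHIKLMNOPQRSTUVWXY")  # 24 classes

def decode_asl(scores, threshold=0.5):
    """Return (letter, confidence) for the top-scoring class,
    or (None, confidence) when the model is not confident enough."""
    idx = max(range(len(scores)), key=lambda i: scores[i])
    conf = scores[idx]
    if conf < threshold:
        return None, conf
    return ASL_LABELS[idx], conf
```

Thresholding the top score this way is a simple guard against flickering predictions between frames; smoothing over a short window of frames is another common refinement.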
To train your own ASL recognition (or any gesture classification) model
Please refer to the training script in the training folder. We provide all of the data used to train the ASL recognition model; you can replace the data or modify the training script to train your own model. The training script saves the trained model as a frozen TensorFlow .pb graph, which can then be converted to run on the OAK-D hardware using OpenVINO's mo.py script and myriad_compile.
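The conversion path described above might look like the following sketch. The model file names and input shape are illustrative assumptions; the 6-SHAVE setting matches the names of the provided blobs, and FP16 is the usual precision for the Myriad X VPU:

```bash
# Convert the frozen TensorFlow graph to OpenVINO IR (names/shape are examples)
python3 mo.py \
    --input_model hand_asl.pb \
    --data_type FP16 \
    --input_shape [1,3,224,224]

# Compile the IR into a Myriad blob for the OAK-D, allocating 6 SHAVE cores
myriad_compile \
    -m hand_asl.xml \
    -o hand_asl_6_shaves.blob \
    -VPU_NUMBER_OF_SHAVES 6 \
    -VPU_NUMBER_OF_CMX_SLICES 6
```

Both tools ship with the OpenVINO toolkit; the exact flags can vary between OpenVINO releases, so check the version installed by the dependency script.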