This is a Jupyter notebook project to train and detect hand gestures of American Sign Language using LSTM and MediaPipe.
The project uses computer vision techniques to recognize and interpret hand gestures, offering a more intuitive and natural way of interacting with computers. The included Jupyter notebook contains the code for detecting and tracking hand landmarks using the MediaPipe Hands library and OpenCV.
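As a rough illustration of the detection-and-tracking step, the following is a minimal sketch of a webcam loop built on the MediaPipe Hands solution and OpenCV. It is not taken from the notebook itself; the camera index, confidence thresholds, and window name are assumptions, and the legacy `mediapipe.solutions` API is used.

```python
import cv2
import mediapipe as mp

mp_hands = mp.solutions.hands
mp_drawing = mp.solutions.drawing_utils

# Open the default webcam (index 0 is an assumption about the local setup).
cap = cv2.VideoCapture(0)

with mp_hands.Hands(max_num_hands=2,
                    min_detection_confidence=0.5,
                    min_tracking_confidence=0.5) as hands:
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break

        # MediaPipe expects RGB input, while OpenCV captures BGR frames.
        rgb = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)
        results = hands.process(rgb)

        # Draw the 21 landmarks of each detected hand on the original frame.
        if results.multi_hand_landmarks:
            for hand_landmarks in results.multi_hand_landmarks:
                mp_drawing.draw_landmarks(frame, hand_landmarks,
                                          mp_hands.HAND_CONNECTIONS)

        cv2.imshow("Hand landmarks", frame)
        if cv2.waitKey(1) & 0xFF == ord('q'):
            break

cap.release()
cv2.destroyAllWindows()
```

For training, the per-frame landmark coordinates can be flattened into feature vectors and stacked into fixed-length sequences, which is the kind of input an LSTM gesture classifier would consume; the exact feature layout and model architecture used in the notebook are not shown here.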