The Gesture Classifier is a Streamlit application that provides an easy-to-use interface for classifying hand gestures in images. It is built on a multimodal computer-vision pipeline that detects persons in an image, estimates pose keypoints for each detected person, and classifies gestures from the resulting poses.
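The three stages of that pipeline can be sketched as follows. The function names and the toy wrist-above-shoulder rule are illustrative assumptions, not the repository's actual API:

```python
# Illustrative sketch of the three-stage pipeline: person detection,
# pose (keypoint) estimation, and gesture classification.
# All names and the trivial rule below are hypothetical stand-ins.
from typing import Dict, List, Tuple

Keypoints = Dict[str, Tuple[float, float]]  # joint name -> (x, y); y grows downward

def detect_persons(image) -> List[Tuple[int, int, int, int]]:
    """Stage 1: return person bounding boxes (a real app would run a detector)."""
    return [(0, 0, image["w"], image["h"])]  # toy: one box covering the frame

def estimate_keypoints(image, box) -> Keypoints:
    """Stage 2: pose estimation on the person crop (stubbed here)."""
    return image["keypoints"]  # toy: keypoints carried along with the input

def classify_gesture(kp: Keypoints) -> str:
    """Stage 3: map the keypoint skeleton to a gesture label via a toy rule."""
    if kp["right_wrist"][1] < kp["right_shoulder"][1]:  # wrist above shoulder
        return "hand raised"
    return "neutral"

# Running the stages in sequence on a toy "image":
image = {"w": 640, "h": 480,
         "keypoints": {"right_wrist": (300.0, 100.0),
                       "right_shoulder": (280.0, 200.0)}}
labels = [classify_gesture(estimate_keypoints(image, box))
          for box in detect_persons(image)]
print(labels)  # ['hand raised']
```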
Demo video: `demo_video.mp4`
Before using the Gesture Classifier, make sure you have the required software and packages installed. You can install the packages using the provided `requirements.txt` file:

```
pip install -r requirements.txt
```
- Clone the repository:

  ```
  git clone https://github.com/rigvedrs/Red-Hen-Gesture-Classifier.git
  cd Red-Hen-Gesture-Classifier
  ```
- Run the Streamlit application:

  ```
  streamlit run run_pipeline.py
  ```
- Run the Streamlit application as described in the installation section.
- Upload an image or select one of the predefined test images.
- Click the "Classify Poses" button to perform gesture classification.
- Click the "Display Images" button to view the results.
- The processed images will be displayed along with the classified gesture for each detected person.
A set of predefined test images is included in the `test_imgs` folder for your convenience.
This project is licensed under the MIT License - see the LICENSE file for details.