
Gesture Classifier 🚀


The Gesture Classifier is a Streamlit application that provides an easy interface for classifying hand gestures in images. It uses a multimodal pipeline of computer vision models to detect people in an image, estimate pose keypoints for each person, and classify gestures from the detected poses.
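
To give a feel for the three stages described above, here is a minimal sketch. The model weights, thresholds, and helper names are assumptions for illustration only (and it assumes an Ultralytics YOLO pose model); the repository's actual pipeline lives in run_pipeline.py:

    # Hypothetical sketch of the three stages: person detection -> pose keypoints -> gesture label.
    # The model file and the classify_gesture helper are placeholders, not the repository's code.
    from ultralytics import YOLO


    def classify_gesture(keypoints):
        """Placeholder gesture classifier: maps a (num_keypoints, 2) array to a label."""
        # The real pipeline uses a trained classifier on the detected pose.
        return "unknown"


    def run_sketch(image_path: str):
        # Stages 1 and 2: detect persons and estimate their pose keypoints.
        pose_model = YOLO("yolov8n-pose.pt")  # assumed pose-estimation weights
        results = pose_model(image_path)

        # Stage 3: classify a gesture for each detected person.
        labels = []
        for person_kpts in results[0].keypoints.xy:  # one (num_keypoints, 2) tensor per person
            labels.append(classify_gesture(person_kpts.cpu().numpy()))
        return labels


    if __name__ == "__main__":
        # "example.jpg" is a hypothetical file name; the repo ships test images in test_imgs/.
        print(run_sketch("test_imgs/example.jpg"))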

demo_video.mp4

Table of Contents

  • Getting Started
  • Usage
  • Predefined Test Images
  • License

Getting Started

Prerequisites

Before using the Gesture Classifier, make sure you have the required software and packages installed. You can install the packages using the provided requirements.txt file:

pip install -r requirements.txt
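
If you want to keep the dependencies isolated, an optional virtual-environment setup (standard Python tooling, not specific to this repository) looks like:

    python -m venv .venv
    source .venv/bin/activate    # on Windows: .venv\Scripts\activate
    pip install -r requirements.txt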

Installation

  1. Clone the repository:

    git clone https://github.com/rigvedrs/Red-Hen-Gesture-Classifier.git
    cd Red-Hen-Gesture-Classifier
  2. Run the Streamlit application:

    streamlit run run_pipeline.py

Usage

Uploading an Image

  1. Run the Streamlit application as described in the installation section.

  2. Upload an image or select a predefined test image.

Classifying Poses

  • Click the "Classify Poses" button to perform gesture classification.

Displaying Results

  • Click the "Display Images" button to view the results.
  • Detected gestures and images will be displayed with their corresponding classifications.
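
For orientation, the upload → classify → display flow described above could be sketched in Streamlit roughly as follows. The widget labels and the classify_poses helper are assumptions based on this README, not the exact code in run_pipeline.py:

    # Rough sketch of the app's UI flow; see run_pipeline.py for the real application.
    import streamlit as st
    from PIL import Image


    def classify_poses(image):
        """Placeholder for the detection + pose + gesture pipeline."""
        return ["gesture: unknown"]  # the real app returns per-person gesture labels


    uploaded = st.file_uploader("Upload an image", type=["jpg", "jpeg", "png"])

    if uploaded is not None:
        image = Image.open(uploaded)

        if st.button("Classify Poses"):
            st.session_state["labels"] = classify_poses(image)

        if st.button("Display Images"):
            for label in st.session_state.get("labels", []):
                st.image(image, caption=label)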

Predefined Test Images

A set of predefined test images is included in the "test_imgs" folder for your convenience.

License

This project is licensed under the MIT License - see the LICENSE file for details.

