This project uses computer vision to detect and classify hand gestures captured via a webcam. It consists of two main components: data collection for building a custom dataset of hand gestures, and real-time classification of those gestures using a pre-trained model.
This project provides tools to:
- Collect images of hand gestures for training a machine learning model.
- Detect hand gestures in real-time and classify them into predefined categories using a deep learning model.
- Hand Detection: Detects hands in the video feed using OpenCV and cvzone's `HandDetector` (a minimal example follows this list).
- Dataset Creation: Saves cropped hand images for creating a custom dataset.
- Gesture Classification: Classifies gestures into categories (e.g., "A", "B", "C", "OK") using a pre-trained Keras model.
- Real-Time Feedback: Displays predictions and bounding boxes around detected hands on the live feed.
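As a quick illustration of the detection step, the snippet below grabs one webcam frame and asks cvzone's `HandDetector` for a bounding box. This is a minimal sketch using only the libraries listed under Requirements, not code taken from the repository.

```python
import cv2
from cvzone.HandTrackingModule import HandDetector

cap = cv2.VideoCapture(0)            # default webcam
detector = HandDetector(maxHands=1)  # track at most one hand

success, frame = cap.read()
if success:
    # findHands returns a list of hand dicts plus the frame with drawings overlaid.
    hands, frame = detector.findHands(frame)
    if hands:
        x, y, w, h = hands[0]["bbox"]  # bounding box later used for cropping
        print(f"Hand detected at x={x}, y={y}, width={w}, height={h}")

cap.release()
```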
The project requires the following Python libraries:
- `cv2` (OpenCV)
- `cvzone`
- `numpy`
- `math` (Python standard library, no installation needed)
Install these dependencies using pip:
```bash
pip install opencv-python cvzone numpy
```

Loading the Keras classification model also requires TensorFlow, which you can install with `pip install tensorflow` if it is not already present.
- Clone the repository:

  ```bash
  git clone https://github.com/VRP-github/HandSign-Detection.git
  ```

- Ensure the following folders and files are present:
  - `Images/` for storing collected gesture images.
  - `Model/keras_model.h5` for the pre-trained classification model.
  - `Model/labels.txt` for gesture labels.
- Connect a webcam to your system.
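Optionally, you can confirm that the expected layout is in place before running the scripts with a quick check such as the one below (a convenience sketch, not part of the repository):

```python
import os

# Paths the scripts expect, relative to the repository root.
required = ["Images", "Model/keras_model.h5", "Model/labels.txt"]

for path in required:
    status = "OK" if os.path.exists(path) else "MISSING"
    print(f"{status:7} {path}")
```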
- Run the `dataCollection.py` script to collect gesture images (a minimal sketch of this step follows these bullets):

  ```bash
  python dataCollection.py
  ```

- Make gestures in front of the camera. Press `s` to save images of your gesture into the `Images/` folder.
- Use the saved images to train your custom gesture classification model.
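For reference, `dataCollection.py` in projects of this kind typically detects the hand, crops it with a small margin, pastes the crop onto a fixed-size white canvas so every sample has the same shape, and writes an image whenever `s` is pressed. The sketch below illustrates that flow; the folder path, canvas size, and padding values are assumptions, so treat the script in the repository as authoritative.

```python
import math
import os
import time

import cv2
import numpy as np
from cvzone.HandTrackingModule import HandDetector

cap = cv2.VideoCapture(0)
detector = HandDetector(maxHands=1)

offset = 20          # padding around the detected bounding box (assumed value)
imgSize = 300        # side length of the square canvas (assumed value)
folder = "Images/A"  # change this per gesture you are collecting
os.makedirs(folder, exist_ok=True)

while True:
    success, img = cap.read()
    if not success:
        break
    hands, img = detector.findHands(img)
    sample = None

    if hands:
        x, y, w, h = hands[0]["bbox"]
        # Crop the hand region with padding, clamped to the frame borders.
        imgCrop = img[max(0, y - offset):y + h + offset,
                      max(0, x - offset):x + w + offset]

        if imgCrop.size:
            # Paste the crop onto a white square, preserving its aspect ratio.
            imgWhite = np.ones((imgSize, imgSize, 3), np.uint8) * 255
            if h > w:  # tall hand: fit the height, centre horizontally
                wCal = math.ceil(imgSize / h * w)
                resized = cv2.resize(imgCrop, (wCal, imgSize))
                wGap = (imgSize - wCal) // 2
                imgWhite[:, wGap:wGap + wCal] = resized
            else:      # wide hand: fit the width, centre vertically
                hCal = math.ceil(imgSize / w * h)
                resized = cv2.resize(imgCrop, (imgSize, hCal))
                hGap = (imgSize - hCal) // 2
                imgWhite[hGap:hGap + hCal, :] = resized
            sample = imgWhite
            cv2.imshow("Sample", sample)

    cv2.imshow("Image", img)
    key = cv2.waitKey(1)
    if key == ord("s") and sample is not None:
        # Save the normalised sample with a unique, timestamp-based name.
        cv2.imwrite(f"{folder}/Image_{time.time()}.jpg", sample)
    elif key == ord("q"):
        break

cap.release()
cv2.destroyAllWindows()
```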
- Ensure the pre-trained model and labels file are present in the `Model/` folder.
- Run the `test.py` script to start gesture classification (see the classification sketch after these bullets):

  ```bash
  python test.py
  ```

- Wave your hand in front of the camera to see real-time predictions displayed on the screen.
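For orientation, a classification loop built on cvzone's `Classifier` usually looks like the sketch below. It assumes `test.py` follows the common cvzone pattern and that the label order matches `Model/labels.txt`; the real script may classify the same white-canvas crop produced during data collection rather than the full frame.

```python
import cv2
from cvzone.HandTrackingModule import HandDetector
from cvzone.ClassificationModule import Classifier

cap = cv2.VideoCapture(0)
detector = HandDetector(maxHands=1)
# Classifier wraps the Keras model plus the labels file (requires TensorFlow).
classifier = Classifier("Model/keras_model.h5", "Model/labels.txt")
labels = ["A", "B", "C", "OK"]  # assumed order; must match Model/labels.txt

while True:
    success, img = cap.read()
    if not success:
        break
    hands, img = detector.findHands(img)

    if hands:
        x, y, w, h = hands[0]["bbox"]
        # getPrediction returns the probability list and the index of the best class.
        # (The repository's script likely classifies a normalised crop, as in the
        # data-collection sketch; passing the full frame keeps this example short.)
        prediction, index = classifier.getPrediction(img, draw=False)
        cv2.rectangle(img, (x, y), (x + w, y + h), (255, 0, 255), 2)
        cv2.putText(img, labels[index], (x, max(0, y - 10)),
                    cv2.FONT_HERSHEY_SIMPLEX, 1, (255, 0, 255), 2)

    cv2.imshow("Hand Sign Detection", img)
    if cv2.waitKey(1) == ord("q"):
        break

cap.release()
cv2.destroyAllWindows()
```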