Develop a hand gesture recognition model that can accurately identify and classify different hand gestures from image or video data, enabling intuitive human-computer interaction and gesture-based control systems. All the steps required for the experiment, such as data preparation, model building, training, evaluation, and saving, are provided in the IPython notebook `PRODIGY_ML_04.ipynb`. Model inference is handled by `app.py` and `live_detection.py`.
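The notebook is the source of truth for the pipeline, but as a rough, hypothetical sketch of the kind of frame preprocessing an image classifier like this typically needs before inference (the 64x64 grayscale input shape and the function name are assumptions, not taken from this repository):

```python
import numpy as np

def preprocess(frame: np.ndarray) -> np.ndarray:
    """Prepare one grayscale frame for the model.

    Assumes the model was trained on 64x64 grayscale images scaled
    to [0, 1]; check the notebook for the real input shape.
    """
    frame = frame.astype("float32") / 255.0   # pixel values -> [0, 1]
    return frame.reshape(1, 64, 64, 1)        # (batch, height, width, channels)
```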
- Clone this repository:

  ```bash
  git clone https://github.com/surajkarki66/PRODIGY_ML_04
  ```
- Create a Python virtual environment and activate it (the exact commands differ slightly between Linux, macOS, and Windows).
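If you have not set one up before, the standard-library `venv` module covers all three platforms; a typical sequence (the environment name `.venv` is just a convention) is:

```shell
# Create the environment (use plain `python` on Windows)
python3 -m venv .venv

# Activate it on Linux/macOS
. .venv/bin/activate

# On Windows (PowerShell), activate with:
#   .venv\Scripts\Activate.ps1
```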
- Download the trained model from here and place it in the project root directory.
- Install the requirements:

  ```bash
  pip install -r requirements.txt
  ```
- Run the normal demo:

  ```bash
  python app.py
  ```
- Run the live detection demo:

  ```bash
  python live_detection.py
  ```
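Both demos end the same way: the model emits one probability per gesture class, and the script reports the most likely one. A minimal sketch of that final step (the gesture names below are placeholders, not the labels used by this repository's model):

```python
import numpy as np

# Placeholder label set -- the real classes are defined in the training notebook.
GESTURES = ["palm", "fist", "thumbs_up", "ok", "peace"]

def decode_prediction(probs):
    """Turn a softmax output vector into (label, confidence)."""
    probs = np.asarray(probs, dtype="float64")
    idx = int(np.argmax(probs))
    return GESTURES[idx], float(probs[idx])

print(decode_prediction([0.05, 0.80, 0.05, 0.05, 0.05]))  # ('fist', 0.8)
```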
Happy coding!