AirFlow is a project that uses hand gestures to control and interact with digital content. It applies computer vision to hand movements captured by a webcam and translates them into actions such as drawing on a canvas or controlling applications.
- Hand gesture recognition using the MediaPipe framework
- Real-time interaction with digital content
- Supports multiple colors and drawing modes
- Clear button for resetting the canvas
- Easy setup and usage
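The hand-tracking feature can be sketched as a short webcam loop: MediaPipe reports 21 hand landmarks with coordinates normalised to [0, 1], which are scaled to pixels before drawing. This is a minimal illustration, not the project's actual code; the function names, the index-fingertip landmark choice (index 8), and the webcam at device 0 are assumptions.

```python
def landmark_to_pixel(lm_x: float, lm_y: float, width: int, height: int) -> tuple:
    """MediaPipe landmarks are normalised to [0, 1]; convert to pixel coordinates."""
    return int(lm_x * width), int(lm_y * height)

def track_hands():
    # Imports live here so landmark_to_pixel works without these packages installed.
    import cv2
    import mediapipe as mp

    hands = mp.solutions.hands.Hands(max_num_hands=1,
                                     min_detection_confidence=0.7)
    cap = cv2.VideoCapture(0)  # assumed webcam index
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        frame = cv2.flip(frame, 1)  # mirror so movement feels natural
        results = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if results.multi_hand_landmarks:
            h, w = frame.shape[:2]
            tip = results.multi_hand_landmarks[0].landmark[8]  # index fingertip
            cv2.circle(frame, landmark_to_pixel(tip.x, tip.y, w, h),
                       6, (0, 255, 0), -1)
        cv2.imshow("AirFlow", frame)
        if cv2.waitKey(1) & 0xFF == ord("q"):
            break
    cap.release()
    cv2.destroyAllWindows()

# Call track_hands() to run the loop with a webcam attached.
```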
- Clone the repository:
  git clone https://github.com/MadhumithaKolkar/AirFlow.git
- Install the required packages:
  python -m pip install opencv-python numpy mediapipe
- Run the Python script:
  python AirFlow.py
Use hand gestures to interact with the application:
- Move your hand to draw on the canvas
- Change colors by selecting different regions on the screen
- Clear the canvas by pressing the "CLEAR" button
- Exit the application by pressing the "q" key
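The colour-selection step above can be sketched as a pure function: divide a toolbar strip at the top of the frame into equal regions and map the fingertip's x position to the region it falls in. The region layout (CLEAR plus three BGR ink colours) and the function name are hypothetical, shown only to illustrate the idea.

```python
# Hypothetical toolbar layout: CLEAR button plus three ink colours (BGR tuples).
REGIONS = ["CLEAR", (255, 0, 0), (0, 255, 0), (0, 0, 255)]  # clear, blue, green, red

def pick_color(x_px: int, frame_width: int):
    """Map a fingertip x position (in pixels) to the toolbar region it is over."""
    region_width = frame_width // len(REGIONS)
    index = min(x_px // region_width, len(REGIONS) - 1)
    return REGIONS[index]
```

In the main loop, this would be consulted only while the fingertip is inside the toolbar strip; returning "CLEAR" would reset the canvas, while a colour tuple would become the current ink.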
Contributions are welcome! Here's how you can contribute:
- Fork the repository
- Create your feature branch (git checkout -b feature/your-feature)
- Commit your changes (git commit -am 'Add some feature')
- Push to the branch (git push origin feature/your-feature)
- Create a new Pull Request
- MediaPipe - Hand tracking framework
- OpenCV - Computer vision library
- Madhumitha Kolkar
- This project is licensed under the MIT License - see the LICENSE file for details.