This Flask application uses computer vision to detect hand signs and translate them in real time. It leverages MediaPipe for hand-landmark detection and a custom keypoint classifier to recognize hand signs, and offers two modes: one for learning gestures and one for translating them into text.
- Hand Sign Detection: Uses a webcam to capture hand movements and recognize specific hand signs.
- Gesture Learning: Allows users to learn different gestures.
- Sign Translation: Translates recognized signs into text.
- Real-time Video Streaming: Displays the camera feed with detected hand signs and relevant information.
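A keypoint classifier of this kind typically consumes MediaPipe's 21 hand landmarks after normalizing them so the features are independent of where the hand sits in the frame. The following is a minimal sketch of that preprocessing step, not the project's actual code; `preprocess_landmarks` is a hypothetical helper name, and the real pipeline in `my_app.py` may normalize differently.

```python
def preprocess_landmarks(landmarks):
    """Convert 21 (x, y) landmark pairs into a translation- and
    scale-invariant feature vector, as a keypoint classifier might expect.

    `landmarks` is a list of 21 (x, y) tuples in image coordinates,
    with index 0 being the wrist (MediaPipe's landmark ordering).
    """
    base_x, base_y = landmarks[0]  # use the wrist landmark as the origin
    # Translate so the wrist is at (0, 0), then flatten to one list.
    flat = []
    for x, y in landmarks:
        flat.extend([x - base_x, y - base_y])
    # Scale by the largest absolute value so all features lie in [-1, 1].
    max_value = max(abs(v) for v in flat) or 1.0
    return [v / max_value for v in flat]
```

The resulting 42-element vector can then be fed to the classifier regardless of hand position or distance from the camera.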
- Python 3.x
- Flask
- OpenCV
- MediaPipe
- Other dependencies listed in `requirements.txt`
- Clone the repository:

  ```bash
  git clone https://github.com/hammadali1805/SignLink.git
  cd SignLink
  ```

- Install the required packages:

  ```bash
  pip install -r requirements.txt
  ```
- Make sure to update any paths in the code if needed.
- Start the Flask application:

  ```bash
  python my_app.py
  ```
- Open your web browser and navigate to `http://127.0.0.1:5000/`.
- Use the following endpoints:
  - `/` - Home page
  - `/learn` - Learning mode for hand signs
  - `/translate` - Translation mode for hand signs
  - `/sign` - Get the currently selected sign
  - `/detectedsign` - Get the detected sign
  - `/video/<usecase>` - Real-time video streaming (use `learn` or `translate` as the use case)
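The `/video/<usecase>` endpoint streams frames to the browser. A common way to do this in Flask is an MJPEG multipart response, sketched below. This is an illustrative skeleton, not the repository's code: the real app would pull JPEG-encoded webcam frames from OpenCV (e.g. via `cv2.imencode(".jpg", frame)`) inside the generator, and the route and function names here are assumptions.

```python
from flask import Flask, Response

app = Flask(__name__)

def generate_frames(usecase):
    """Yield JPEG frames as a multipart stream.

    A placeholder byte string stands in for camera frames so the
    sketch stays self-contained and runnable without a webcam.
    """
    for jpg in [b"fake-jpeg-bytes"]:  # placeholder for the camera capture loop
        yield (b"--frame\r\n"
               b"Content-Type: image/jpeg\r\n\r\n" + jpg + b"\r\n")

@app.route("/video/<usecase>")
def video(usecase):
    # multipart/x-mixed-replace tells the browser to replace each
    # received frame in place, producing a live video effect.
    return Response(generate_frames(usecase),
                    mimetype="multipart/x-mixed-replace; boundary=frame")
```

On the page, an `<img src="/video/learn">` tag is enough to display the stream; the browser handles the multipart boundaries itself.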
- Use keys `0`-`9` to select different hand signs.
- Press `n` to switch to learning mode.
- Press `k` to switch to translation mode.
- Press `h` for help or information about the current mode.
- Press `ESC` to exit the application.
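Key bindings like these are usually read via `cv2.waitKey(1) & 0xFF` in the capture loop and routed through a small dispatch helper. The sketch below illustrates one way to structure that dispatch; `handle_key` and the `state` dictionary keys are hypothetical names, not taken from `my_app.py`.

```python
def handle_key(key, state):
    """Update the app state for one key press.

    `key` is the integer key code (as returned by cv2.waitKey(1) & 0xFF);
    `state` holds the current mode, the selected sign, and a run flag.
    """
    if ord("0") <= key <= ord("9"):
        state["sign"] = key - ord("0")   # select hand sign 0-9
    elif key == ord("n"):
        state["mode"] = "learn"          # switch to learning mode
    elif key == ord("k"):
        state["mode"] = "translate"      # switch to translation mode
    elif key == ord("h"):
        state["show_help"] = True        # show help for the current mode
    elif key == 27:                      # ESC key code
        state["running"] = False         # exit the application loop
    return state
```

Keeping the key handling in one function makes it easy to add new bindings without touching the video-capture loop.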
- MediaPipe: https://google.github.io/mediapipe/
- OpenCV: https://opencv.org/
- Other resources and libraries utilized in the project.
This project is licensed under the MIT License - see the LICENSE file for details.