Interactive real-time computer vision experiments built with Python. The application demonstrates face tracking, hand tracking, gesture-based control, augmented reality overlays, and air drawing using a webcam.
Real-time facial landmark tracking using MediaPipe. The system detects key facial points and visualizes them directly on the video stream.
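MediaPipe returns landmarks as normalized coordinates in [0, 1]; drawing them on the video stream requires scaling to pixel coordinates first. A minimal sketch of that conversion (the function name and plain-tuple input are illustrative, not taken from the project's source):

```python
def landmarks_to_pixels(landmarks, frame_width, frame_height):
    """Convert normalized (x, y) landmarks in [0, 1] to integer pixel coordinates."""
    points = []
    for x, y in landmarks:
        # Clamp so landmarks at the frame edge never index out of bounds.
        px = min(int(x * frame_width), frame_width - 1)
        py = min(int(y * frame_height), frame_height - 1)
        points.append((px, py))
    return points
```

Each resulting point can then be drawn onto the frame, e.g. with `cv2.circle`.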
Tracks fingertip positions and overall hand structure in real time using MediaPipe's hand landmark model.
A transparent Doctor Strange–style visual effect follows the detected hand in real time, creating a simple augmented reality interaction.
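A transparent overlay like this is typically composited with standard alpha blending: the PNG's alpha channel weights the overlay against the underlying frame pixels. A sketch with NumPy, assuming an RGBA overlay array placed fully inside the frame (the function name is illustrative; negative offsets are not handled):

```python
import numpy as np

def blend_overlay(frame, overlay_rgba, x, y):
    """Alpha-blend an RGBA overlay onto frame with its top-left corner at (x, y)."""
    h, w = overlay_rgba.shape[:2]
    # Clip the overlay region to the frame bounds.
    h = min(h, frame.shape[0] - y)
    w = min(w, frame.shape[1] - x)
    if h <= 0 or w <= 0:
        return frame
    roi = frame[y:y + h, x:x + w].astype(np.float32)
    rgb = overlay_rgba[:h, :w, :3].astype(np.float32)
    alpha = overlay_rgba[:h, :w, 3:4].astype(np.float32) / 255.0
    frame[y:y + h, x:x + w] = (alpha * rgb + (1 - alpha) * roi).astype(np.uint8)
    return frame
```

Re-anchoring (x, y) to a hand landmark every frame makes the effect follow the hand.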
Control your computer using hand gestures:
- Move cursor
- Click actions
- Touchless interaction
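Cursor movement of this kind usually maps the index fingertip's normalized camera coordinates onto the screen, with a margin so the whole screen is reachable without pushing the hand to the frame edge. A hedged sketch of such a mapping (the margin value and function name are assumptions, not the project's actual code); PyAutoGUI's `moveTo` would then position the cursor:

```python
def finger_to_screen(nx, ny, screen_w, screen_h, margin=0.15):
    """Map normalized fingertip coords (nx, ny) in [0, 1] to screen pixels.

    The central (1 - 2*margin) band of the camera frame covers the whole
    screen, so the cursor can reach the edges comfortably.
    """
    def remap(v):
        # Rescale [margin, 1 - margin] -> [0, 1], clamped.
        v = (v - margin) / (1 - 2 * margin)
        return min(max(v, 0.0), 1.0)

    return int(remap(nx) * (screen_w - 1)), int(remap(ny) * (screen_h - 1))
```

Usage: `pyautogui.moveTo(*finger_to_screen(nx, ny, 1920, 1080))`.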
Turn your index finger into a digital brush:
- Draw in real time
- Clear the canvas
- Smooth motion tracking
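Raw landmark positions jitter from frame to frame, so smooth strokes generally come from low-pass filtering, for example an exponential moving average. A minimal sketch (the class name and smoothing factor are illustrative assumptions):

```python
class PointSmoother:
    """Exponential moving average over 2D points to damp landmark jitter."""

    def __init__(self, alpha=0.3):
        self.alpha = alpha  # lower alpha = smoother but laggier
        self._last = None

    def update(self, x, y):
        if self._last is None:
            self._last = (x, y)
        else:
            lx, ly = self._last
            self._last = (lx + self.alpha * (x - lx),
                          ly + self.alpha * (y - ly))
        return self._last
```

Feeding each smoothed point into the stroke instead of the raw fingertip position keeps the drawn line steady.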
- Python
- Streamlit
- MediaPipe
- OpenCV
- PyAutoGUI
Computer-Vision-Experiments-Lab
│
├── src
│ ├── main.py
│ ├── facelandmark.py
│ ├── hand_landmark_mask.py
│ ├── mouse_control.py
│ └── drawing_mode.py
│
├── models
│ ├── hand_landmarker.task
│ └── face_landmarker.task
│
├── assets
│ ├── dr_strange_hand.png
│ └── demo.gif
│
├── requirements.txt
└── README.md
git clone https://github.com/yourusername/Computer-Vision-Experiments-Lab.git
cd Computer-Vision-Experiments-Lab
pip install -r requirements.txt
streamlit run src/main.py
Open the local Streamlit address shown in the terminal.
Allow camera access when prompted.
Used Models
- models/hand_landmarker.task (MediaPipe hand landmarker)
- models/face_landmarker.task (MediaPipe face landmarker)
How to Use
- Launch the application.
- Select a mode from the interface.
- Position yourself in front of the webcam.
- Try different gestures.
Available modes:
- Face Landmark Detection
- Hand Landmark Mask
- Hand Mouse Control
- Drawing Mode
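Internally, a mode selector can route each webcam frame to the handler for the chosen mode via a dispatch table. A minimal sketch (the handler names are hypothetical; the real handlers live in the modules under src/):

```python
# Hypothetical stub handlers; the project's real handlers process webcam frames.
def run_face_landmarks(frame): return "face"
def run_hand_mask(frame): return "mask"
def run_mouse_control(frame): return "mouse"
def run_drawing(frame): return "draw"

MODES = {
    "Face Landmark Detection": run_face_landmarks,
    "Hand Landmark Mask": run_hand_mask,
    "Hand Mouse Control": run_mouse_control,
    "Drawing Mode": run_drawing,
}

def handle_frame(mode, frame):
    """Route a webcam frame to the handler for the selected mode."""
    # In Streamlit the mode would come from e.g.
    # mode = st.sidebar.selectbox("Mode", list(MODES))
    return MODES[mode](frame)
```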
Planned features:
- Gesture shortcuts
- Multiple brush colors
- Adjustable brush size
- Performance optimization
- Additional AR effects
YamenRM, AI Engineering Student