Real-Time Computer Vision System for Palm Orientation, Gesture Analytics & Interactive UI Rendering

This project implements a high-performance, real-time hand-tracking system that computes the palm rotation angle with pure NumPy vector math, using MediaPipe for hand detection and OpenCV for visualization.
It includes a futuristic HUD (Heads-Up Display) with an animated radial UI, fingertip gears, a palm-openness metric, styled rendering of the 21-point hand skeleton, and a gesture-aware cube/grid visualization. The repository serves as a complete example of integrating vector geometry, signal-smoothing filters, 3D-like visual effects, and computer vision pipelines in Python.
✔ Real-time Hand Tracking (21 Landmarks)
✔ Palm Rotation Angle (0°–360°) using NumPy
✔ Openness % based on fingertip–palm distances
✔ Dynamic Animated Radial UI & Gears
✔ Exponential Smoothing (no jitter)
✔ Fully Modular Code (Overlay + Utils + Main loop)
✔ Runs Smoothly at 25–40 FPS
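The exponential smoothing mentioned above can be sketched as follows. Note that a raw exponential moving average breaks at the 0°/360° wrap-around, so the filter should smooth the shortest signed angular difference instead. The function name `smooth_angle` and the `alpha` value are illustrative, not taken from the repository:

```python
def smooth_angle(prev, new, alpha=0.3):
    """Exponentially smooth an angle in degrees, handling 0/360 wrap-around.

    alpha close to 0 -> heavy smoothing; alpha close to 1 -> fast response.
    """
    # Shortest signed difference from prev to new, in (-180, 180]
    diff = (new - prev + 180.0) % 360.0 - 180.0
    return (prev + alpha * diff) % 360.0
```

For example, smoothing from 350° toward 10° moves forward through 0° rather than swinging backwards through 180°.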
Palm rotation is computed using vector orientation math:

- Take the wrist landmark (0) as `w` and the middle-finger MCP landmark (9) as `m`.
- Form the vector `v = m - w`.
- Compute the angle with `atan2(v_y, v_x)`.
- Normalize the result to 0°–360°.
```python
import numpy as np

def compute_palm_rotation(landmarks):
    """Angle of the wrist -> middle-MCP vector, normalized to 0-360 degrees."""
    w = np.array(landmarks[0][:2])  # wrist (landmark 0), x/y only
    m = np.array(landmarks[9][:2])  # middle finger MCP (landmark 9)
    v = m - w
    angle = np.degrees(np.arctan2(v[1], v[0]))
    return float(angle % 360)
```
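The openness metric from the feature list can be sketched in the same style: average the fingertip-to-palm distances and map that onto 0–100 %. The palm-centre estimate, the `min_d`/`max_d` calibration range, and the function name `compute_openness` are assumptions for illustration, not the repository's exact values:

```python
import numpy as np

# Fingertip indices in MediaPipe's 21-point hand model
FINGERTIPS = [4, 8, 12, 16, 20]
WRIST, MIDDLE_MCP = 0, 9

def compute_openness(landmarks, min_d=0.05, max_d=0.25):
    """Openness in percent from mean fingertip-to-palm distance.

    Assumes normalized MediaPipe coordinates; min_d/max_d are
    illustrative calibration bounds for a closed vs. open hand.
    """
    pts = np.asarray(landmarks, dtype=float)[:, :2]
    palm = (pts[WRIST] + pts[MIDDLE_MCP]) / 2.0  # rough palm centre
    mean_d = np.mean(np.linalg.norm(pts[FINGERTIPS] - palm, axis=1))
    frac = np.clip((mean_d - min_d) / (max_d - min_d), 0.0, 1.0)
    return float(frac * 100.0)
```

A clenched fist (fingertips near the palm centre) maps to 0 %, a fully spread hand to 100 %; the bounds would be tuned against real landmark data.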