# 3DEffect_os

An experimental computer vision project that creates the illusion of depth on a standard flat monitor using real-time head tracking and perspective-correct OpenGL rendering.
This system tracks the user’s head position with a webcam and dynamically adjusts the camera perspective, producing a pseudo-3D parallax effect without special hardware.
If you do not have a physical second monitor, you can use a virtual display solution such as GlideX.
## How It Works

- A webcam captures the user’s face in real time.
- MediaPipe detects facial landmarks and extracts the nose position.
- The nose coordinates are converted into x, y, z head movement values.
- These values control a virtual OpenGL camera.
- The camera updates perspective dynamically, creating a depth illusion.
- Screen content is captured using MSS.
- The captured frame is mapped as a texture on a front plane.
- A virtual 3D room is rendered around the plane.
- Side walls use reflections to enhance the depth effect.
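The tracking steps above can be sketched roughly as follows. The landmark index, `sensitivity` constant, and function names here are illustrative assumptions, not the project's actual code:

```python
def head_offset(nose_x, nose_y, nose_z, sensitivity=2.0):
    """Map a normalized nose position (MediaPipe coords in [0, 1]) to
    centred head-offset values for the virtual camera.

    Mirrors x so that moving your head left shifts the camera left;
    `sensitivity` is a hypothetical tuning constant.
    """
    return ((0.5 - nose_x) * sensitivity,
            (nose_y - 0.5) * sensitivity,
            nose_z)


def run_tracking_demo():
    """One-frame FaceMesh demo; requires opencv-python, mediapipe, a webcam."""
    import cv2
    import mediapipe as mp

    cap = cv2.VideoCapture(0)
    with mp.solutions.face_mesh.FaceMesh(max_num_faces=1) as face_mesh:
        ok, frame = cap.read()
        if ok:
            results = face_mesh.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
            if results.multi_face_landmarks:
                nose = results.multi_face_landmarks[0].landmark[1]  # nose tip
                print(head_offset(nose.x, nose.y, nose.z))
    cap.release()
```

Calling `run_tracking_demo()` in a loop and feeding the returned offsets into the camera each frame gives the parallax response described above.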
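The screen-to-texture step might look like this sketch (function names are hypothetical, and the `glTexImage2D` upload needs an active OpenGL context, so it lives in a separate demo function):

```python
import numpy as np


def bgra_to_rgb(raw: bytes, width: int, height: int) -> np.ndarray:
    """Convert MSS's raw BGRA pixel buffer into an RGB array for glTexImage2D."""
    frame = np.frombuffer(raw, dtype=np.uint8).reshape(height, width, 4)
    return np.ascontiguousarray(frame[:, :, 2::-1])  # BGRA -> RGB, drop alpha


def upload_screen_texture():
    """Grab the primary monitor with MSS and upload it as a GL texture.

    Requires an active OpenGL context (e.g. a GLUT or GLFW window).
    """
    import mss
    from OpenGL.GL import (GL_RGB, GL_TEXTURE_2D, GL_UNSIGNED_BYTE,
                           glBindTexture, glGenTextures, glTexImage2D)

    with mss.mss() as sct:
        shot = sct.grab(sct.monitors[1])  # monitors[1] is the primary display
        rgb = bgra_to_rgb(shot.raw, shot.width, shot.height)

    tex = glGenTextures(1)
    glBindTexture(GL_TEXTURE_2D, tex)
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, shot.width, shot.height,
                 0, GL_RGB, GL_UNSIGNED_BYTE, rgb)
    return tex
```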
## Tech Stack

- Python 3.10.11 (recommended)
- OpenGL (PyOpenGL)
- OpenCV
- MediaPipe
- MSS (screen capture)
## Features

- Real-time head tracking
- Perspective-correct parallax rendering
- Pseudo-3D effect on standard monitors
- No special display hardware required
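The "perspective-correct" part of such an effect usually comes from an asymmetric (off-axis) viewing frustum recomputed each frame from the head position. A minimal sketch, assuming a screen of 1.0 × 0.75 world half-units and a head position expressed in the same units (all names and values illustrative):

```python
def off_axis_frustum(head_x, head_y, head_z,
                     near=0.1, far=100.0,
                     half_w=1.0, half_h=0.75):
    """Frustum bounds for glFrustum that keep the screen plane fixed in
    space while the eye moves; head_z is the eye's distance from the screen."""
    scale = near / head_z  # project the screen edges onto the near plane
    left = (-half_w - head_x) * scale
    right = (half_w - head_x) * scale
    bottom = (-half_h - head_y) * scale
    top = (half_h - head_y) * scale
    return left, right, bottom, top, near, far
```

Each frame you would pass the result straight to `glFrustum(*off_axis_frustum(x, y, z))`, then translate the modelview by `(-x, -y, 0)` so the eye actually sits at the tracked position.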
## Requirements

- Python 3.10.11
- Webcam
- OpenGL-compatible GPU
- Second monitor (physical or virtual via GlideX)
## Installation

```bash
git clone https://github.com/Terminator1321/3DEffect_os.git
cd 3DEffect_os
pip install -r requirements.txt
python main.py
```

Make sure:
- Your webcam is connected
- OpenGL drivers are installed
- The second monitor is configured and active
## Use Cases

- AR/VR experiments
- Simulation displays
- Interactive installations
- Research in perspective-based rendering
## Notes

This is an experimental prototype and may require manual configuration depending on hardware and monitor setup.
## Demo

[demo.mp4](https://github.com/Terminator1321/3DEffect_os/blob/main/demo.mp4)