A depth-reprojection hologram viewer that transforms flat video or images paired with a depth map into an interactive 3D particle field — and places it as a world-anchored hologram in AR space.
DepthShift takes two inputs: a source video/image and a corresponding depth map. It uses a custom GLSL vertex shader to displace a dense particle grid along the Z axis based on per-pixel depth values — turning a flat 2D frame into a parallax-reactive 3D point cloud.
Core pipeline:
- Depth sampling — the depth map is sampled per particle in the vertex shader. Bright pixels push particles toward the viewer, dark pixels recede.
- Particle grid — a uniform grid of geometry points (configurable density) is generated, each mapped to a UV coordinate on the source texture.
- Vertex displacement — each point is displaced along the Z axis by `depth * depthScale`, creating volumetric separation between foreground and background elements
- Three view modes:
- Particles — raw point cloud with additive blending and configurable point size
- Mesh — depth-displaced mesh surface rendered as a wireframe or solid
- SBS Stereo — side-by-side stereoscopic output for VR headsets, rendered with two offset cameras simulating eye separation
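The grid-and-displacement steps above can be sketched in plain JavaScript. In the actual project this runs on the GPU in a GLSL vertex shader; the helper names `buildParticleGrid` and `displace` here are hypothetical, CPU-side illustrations of the same math.

```javascript
// Generate a cols x rows grid of particles, each carrying a UV coordinate
// into the source texture. Positions start on a flat plane at z = 0,
// centered on the origin.
function buildParticleGrid(cols, rows) {
  const particles = [];
  for (let y = 0; y < rows; y++) {
    for (let x = 0; x < cols; x++) {
      const u = x / (cols - 1);
      const v = y / (rows - 1);
      particles.push({ u, v, x: u - 0.5, y: v - 0.5, z: 0 });
    }
  }
  return particles;
}

// Displace each particle along Z by its sampled depth value.
// depthAt(u, v) returns a brightness in [0, 1]: bright = near, dark = far.
function displace(particles, depthAt, depthScale) {
  for (const p of particles) {
    p.z = depthAt(p.u, p.v) * depthScale;
  }
  return particles;
}
```

Running the real pipeline per-vertex on the GPU means the density slider only changes the grid resolution; the displacement cost stays in the shader.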
Interaction:
- Click and drag to orbit the particle field
- Scroll to zoom
- All transforms applied via quaternion rotation to avoid gimbal lock
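A minimal sketch of the quaternion orbit transform, assuming hypothetical helper names (the project itself would more likely use Three.js's built-in `Quaternion`). Composing each drag delta as a quaternion and accumulating it onto the current orientation avoids the gimbal lock that accumulating Euler angles would cause.

```javascript
// Quaternion from a unit axis and an angle in radians.
function fromAxisAngle(axis, angle) {
  const s = Math.sin(angle / 2);
  return { w: Math.cos(angle / 2), x: axis[0] * s, y: axis[1] * s, z: axis[2] * s };
}

// Hamilton product: the rotation "a after b".
function multiply(a, b) {
  return {
    w: a.w * b.w - a.x * b.x - a.y * b.y - a.z * b.z,
    x: a.w * b.x + a.x * b.w + a.y * b.z - a.z * b.y,
    y: a.w * b.y - a.x * b.z + a.y * b.w + a.z * b.x,
    z: a.w * b.z + a.x * b.y - a.y * b.x + a.z * b.w,
  };
}

// Per drag event: yaw around world Y from horizontal motion, pitch around
// world X from vertical motion, accumulated onto the current orientation.
function orbit(current, dxPixels, dyPixels, sensitivity = 0.005) {
  const yaw = fromAxisAngle([0, 1, 0], dxPixels * sensitivity);
  const pitch = fromAxisAngle([1, 0, 0], dyPixels * sensitivity);
  return multiply(multiply(pitch, yaw), current);
}
```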
DepthPlane uses WebXR with plane detection and hand tracking to anchor the depth-reprojected hologram to a real-world surface.
Core pipeline:
- Plane detection — WebXR's `plane-detection` feature scans the environment and identifies flat surfaces (floors, tables, walls)
- Hand raycasting — a ray is cast from the index finger tip; when it intersects a detected plane, a placement indicator appears
- World anchoring — on pinch gesture, the hologram mesh is instantiated at the hit point and locked to world coordinates via an XRAnchor
- Depth mesh — the same vertex-displaced geometry from DepthShift is used, rendered as a textured mesh in world space
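The hand-raycasting step above comes down to a ray/plane intersection. This is an illustrative sketch only — in a real session the detected `XRPlane` geometry (or WebXR hit testing) supplies the plane; the function name is hypothetical.

```javascript
// ray:   { origin: [x,y,z], dir: [x,y,z] } with dir normalized
//        (origin at the index fingertip, dir pointing along the finger).
// plane: { point: [x,y,z], normal: [x,y,z] } from plane detection.
// Returns the hit point, or null if the ray misses.
function intersectRayPlane(ray, plane) {
  const denom =
    ray.dir[0] * plane.normal[0] +
    ray.dir[1] * plane.normal[1] +
    ray.dir[2] * plane.normal[2];
  if (Math.abs(denom) < 1e-6) return null; // ray parallel to the plane

  const diff = [
    plane.point[0] - ray.origin[0],
    plane.point[1] - ray.origin[1],
    plane.point[2] - ray.origin[2],
  ];
  const t =
    (diff[0] * plane.normal[0] +
      diff[1] * plane.normal[1] +
      diff[2] * plane.normal[2]) / denom;
  if (t < 0) return null; // plane is behind the fingertip

  return [
    ray.origin[0] + ray.dir[0] * t,
    ray.origin[1] + ray.dir[1] * t,
    ray.origin[2] + ray.dir[2] * t,
  ];
}
```

On a pinch, the returned hit point is where the hologram would be instantiated and anchored.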
Requirements:
- Meta Quest 3 or any WebXR-capable headset with plane detection support
- Served over HTTPS (required for WebXR)
```
# Clone the repo
git clone https://github.com/yourusername/project-depthshift.git
cd project-depthshift

# Serve over HTTPS (required for WebXR)
npx serve .
```

Open `hologram.html` in any modern browser for the particle viewer.
For AR, open ar.html on a Meta Quest 3 browser or any WebXR-capable device.
```
npx vercel
```

Vercel automatically serves over HTTPS, which satisfies the WebXR security requirement.
The assets/ folder contains sample source and depth map videos to test with:
| File | Description |
|---|---|
| `assets/src_small.mp4` | Source video |
| `assets/depth_small.mp4` | Corresponding depth map |
Load both in the controls panel — source video first, then depth map — to see the effect.
- Three.js — 3D rendering and shader pipeline
- WebXR Device API — AR session, plane detection, hand tracking
- GLSL — custom vertex shader for depth displacement