# Project Depthshift

A depth-reprojection hologram viewer that transforms flat video or images paired with a depth map into an interactive 3D particle field — and places it as a world-anchored hologram in AR space.


## How It Works

### DepthShift — Video Hologram (hologram.html)

DepthShift takes two inputs: a source video/image and a corresponding depth map. It uses a custom GLSL vertex shader to displace a dense particle grid along the Z axis based on per-pixel depth values — turning a flat 2D frame into a parallax-reactive 3D point cloud.

Core pipeline:

1. Depth sampling — the depth map is sampled per particle in the vertex shader. Bright pixels push particles toward the viewer; dark pixels recede.
2. Particle grid — a uniform grid of geometry points (configurable density) is generated, each mapped to a UV coordinate on the source texture.
3. Vertex displacement — each point is displaced along the Z axis by `depth * depthScale`, creating volumetric separation between foreground and background elements.
4. Three view modes:
   - Particles — raw point cloud with additive blending and configurable point size
   - Mesh — depth-displaced mesh surface rendered as a wireframe or solid
   - SBS Stereo — side-by-side stereoscopic output for VR headsets, rendered with two offset cameras simulating eye separation
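The grid-and-displacement logic in steps 1–3 can be sketched in plain JavaScript. The project itself does this on the GPU in a GLSL vertex shader; the function and parameter names below (`buildParticleField`, `sampleDepth`, `gridWidth`, `gridHeight`) are illustrative, not the project's actual API.

```javascript
// Sketch of the depth-displacement pipeline on the CPU, for clarity.
// The real project runs the equivalent math in a GLSL vertex shader.
function buildParticleField(sampleDepth, gridWidth, gridHeight, depthScale) {
  const particles = [];
  for (let row = 0; row < gridHeight; row++) {
    for (let col = 0; col < gridWidth; col++) {
      // Map each grid point to a UV coordinate on the source texture.
      const u = col / (gridWidth - 1);
      const v = row / (gridHeight - 1);
      // Sample the depth map (0 = dark/far, 1 = bright/near) and
      // displace along Z so bright pixels move toward the viewer.
      const depth = sampleDepth(u, v);
      particles.push({ u, v, x: u - 0.5, y: v - 0.5, z: depth * depthScale });
    }
  }
  return particles;
}
```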

Interaction:

- Click and drag to orbit the particle field
- Scroll to zoom
- All transforms are applied via quaternion rotation to avoid gimbal lock
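To make the quaternion approach concrete, here is the axis-angle rotation math in plain JavaScript. The project would use Three.js's built-in `THREE.Quaternion` for this; the helper functions below are hypothetical, shown only to illustrate why a quaternion rotation has no gimbal-lock singularity.

```javascript
// Build a quaternion [qx, qy, qz, qw] from a unit axis and an angle.
function quatFromAxisAngle([x, y, z], angle) {
  const s = Math.sin(angle / 2);
  return [x * s, y * s, z * s, Math.cos(angle / 2)];
}

// Hamilton product of two quaternions.
function quatMultiply([ax, ay, az, aw], [bx, by, bz, bw]) {
  return [
    aw * bx + ax * bw + ay * bz - az * by,
    aw * by - ax * bz + ay * bw + az * bx,
    aw * bz + ax * by - ay * bx + az * bw,
    aw * bw - ax * bx - ay * by - az * bz,
  ];
}

// Rotate vector v by quaternion q: v' = q * v * conj(q).
function rotateVector(q, [vx, vy, vz]) {
  const conj = [-q[0], -q[1], -q[2], q[3]];
  const [x, y, z] = quatMultiply(quatMultiply(q, [vx, vy, vz, 0]), conj);
  return [x, y, z];
}
```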

### DepthPlane — Spatial Hologram (ar.html)

DepthPlane uses WebXR with plane detection and hand tracking to anchor the depth-reprojected hologram to a real-world surface.

Core pipeline:

1. Plane detection — WebXR's `plane-detection` feature scans the environment and identifies flat surfaces (floors, tables, walls)
2. Hand raycasting — a ray is cast from the index fingertip; when it intersects a detected plane, a placement indicator appears
3. World anchoring — on a pinch gesture, the hologram mesh is instantiated at the hit point and locked to world coordinates via an XRAnchor
4. Depth mesh — the same vertex-displaced geometry from DepthShift is used, rendered as a textured mesh in world space
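The hand-raycasting step boils down to a ray–plane intersection. In the real app WebXR supplies hit results directly; this plain-JavaScript sketch (function name illustrative) just shows the underlying math, with the plane given by a point on it and its normal.

```javascript
// Intersect a ray (origin + t * direction) with an infinite plane.
// Returns the hit point as [x, y, z], or null if there is no forward hit.
function intersectRayPlane(rayOrigin, rayDir, planePoint, planeNormal) {
  const dot = (a, b) => a[0] * b[0] + a[1] * b[1] + a[2] * b[2];
  const denom = dot(rayDir, planeNormal);
  if (Math.abs(denom) < 1e-6) return null; // ray parallel to the plane
  const diff = planePoint.map((p, i) => p - rayOrigin[i]);
  const t = dot(diff, planeNormal) / denom;
  if (t < 0) return null; // plane is behind the ray origin
  return rayOrigin.map((o, i) => o + t * rayDir[i]);
}
```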

Requirements:

- Meta Quest 3 or any WebXR-capable headset with plane-detection support
- Served over HTTPS (required for WebXR)

## Running Locally

```sh
# Clone the repo
git clone https://github.com/yourusername/project-depthshift.git
cd project-depthshift

# Serve locally (localhost counts as a secure context, so WebXR works
# without HTTPS; for testing from a headset over the network, use an HTTPS host)
npx serve .
```

Open hologram.html in any modern browser for the particle viewer. For AR, open ar.html in the Meta Quest 3 browser or on any other WebXR-capable device.


## Deploying to Vercel

```sh
npx vercel
```

Vercel automatically serves over HTTPS, which satisfies the WebXR security requirement.


## Sample Files

The assets/ folder contains sample source and depth map videos to test with:

| File | Description |
| --- | --- |
| `assets/src_small.mp4` | Source video |
| `assets/depth_small.mp4` | Corresponding depth map |

Load both in the controls panel — source video first, then depth map — to see the effect.


## Stack

- Three.js — 3D rendering and shader pipeline
- WebXR Device API — AR session, plane detection, hand tracking
- GLSL — custom vertex shader for depth displacement
