A web application for browsing the BONES-SEED motion capture dataset. Renders skeleton animations on 3D character models in the browser using Three.js, with a searchable file browser, metadata panel, and temporal label overlay.
Supports two character models:
- SOMA body model
- Unitree G1 robot
Temporal labels were created by NVIDIA for the Kimodo project.
Requires Docker with Docker Compose (included in modern Docker installs).
git clone <repo-url>
cd seed-viewer
cp .env.example .env
# Edit .env — set DATA_PATH to your extracted BONES-SEED dataset root
./shdocker.sh # Linux / macOS
# or
docker compose up --build # any platform

The app will be available at http://localhost:8666.
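The container can take a moment to come up. A minimal Python sketch that polls the app URL until the server responds (the URL and port are from this README; the helper name and retry parameters are our own):

```python
import time
import urllib.error
import urllib.request

def wait_for_app(url, timeout=60.0, interval=1.0):
    """Poll `url` until it answers with a non-5xx status or `timeout` expires."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        try:
            with urllib.request.urlopen(url) as resp:
                if resp.status < 500:
                    return True
        except (urllib.error.URLError, ConnectionError):
            pass  # server not up yet; retry after a short pause
        time.sleep(interval)
    return False

# Usage:
# if not wait_for_app("http://localhost:8666"):
#     raise SystemExit("seed-viewer did not come up in time")
```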
DATA_PATH should point to the root of the extracted dataset:
DATA_PATH/
metadata/ # Parquet metadata + jsonl temporal labels
soma_proportional/bvh/ # Original mocap on SOMA — BVH files
soma_uniform/bvh/ # Mocap retargeted to unified SOMA shape — BVH files
g1/csv/ # Mocap retargeted to G1 robot — MuJoCo compatible CSV files
soma_shapes/ # SOMA shape parameters
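A misconfigured DATA_PATH is the most common setup failure, so it can help to sanity-check the layout before starting the app. A hedged sketch that verifies the subdirectories listed above exist (the directory names mirror this README; the function name is ours):

```python
from pathlib import Path

# Subdirectories expected under DATA_PATH, per the layout in this README.
EXPECTED = [
    "metadata",
    "soma_proportional/bvh",
    "soma_uniform/bvh",
    "g1/csv",
    "soma_shapes",
]

def missing_dirs(data_path):
    """Return the expected subdirectories that are absent under `data_path`."""
    root = Path(data_path)
    return [d for d in EXPECTED if not (root / d).is_dir()]

# Usage:
# missing = missing_dirs("/data/bones-seed")
# if missing:
#     raise SystemExit(f"DATA_PATH is missing: {missing}")
```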
For development without Docker:
- Node.js 20.10.0 (see .nvmrc)
- Python 3.11
- PDM for Python dependencies
Backend (terminal 1):
cd backend
pdm install
DATA_ROOT=/path/to/dataset PORT=8080 python src/main.py

Frontend (terminal 2):
cd frontend
npm install
npm run dev # Vite dev server on localhost:5173, proxies /api → localhost:8080

End-to-end tests use Playwright against the running app at http://localhost:8666:
cd frontend
npx playwright install # first time only
npx playwright test # headless Chromium
npx playwright test --headed # with visible browser

The soma/ directory contains a minimal Python example for parsing BONES-SEED motion capture data and running it through the SOMA body model.
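The BVH files under soma_proportional/bvh/ and soma_uniform/bvh/ are plain text. A minimal sketch of reading the header of such a file, extracting joint names, frame count, and frame time (this is not the soma/ example itself, and the sample hierarchy below is illustrative, not a BONES-SEED skeleton):

```python
from io import StringIO

def parse_bvh_header(lines):
    """Return (joint_names, num_frames, frame_time) from BVH text lines."""
    joints, num_frames, frame_time = [], None, None
    for raw in lines:
        tokens = raw.split()
        if not tokens:
            continue
        if tokens[0] in ("ROOT", "JOINT"):
            joints.append(tokens[1])          # joint declarations carry the name
        elif tokens[0] == "Frames:":
            num_frames = int(tokens[1])       # MOTION section: frame count
        elif raw.strip().startswith("Frame Time:"):
            frame_time = float(tokens[2])     # seconds per frame
    return joints, num_frames, frame_time

# Illustrative two-joint BVH snippet (not from the dataset):
sample = """\
HIERARCHY
ROOT Hips
{
  OFFSET 0 0 0
  CHANNELS 6 Xposition Yposition Zposition Zrotation Xrotation Yrotation
  JOINT Spine
  {
    OFFSET 0 10 0
    CHANNELS 3 Zrotation Xrotation Yrotation
    End Site
    {
      OFFSET 0 5 0
    }
  }
}
MOTION
Frames: 2
Frame Time: 0.008333
0 0 0 0 0 0 0 0 0
0 1 0 0 0 0 0 0 0
"""

joints, n, dt = parse_bvh_header(StringIO(sample))
print(joints, n, dt)  # ['Hips', 'Spine'] 2 0.008333
```

For real files, pass an open file handle instead of the StringIO sample.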
BONES-SEED is part of a larger effort to enable humanoid motion data for robotics, physical AI, and other applications.
Check out these related works:
- SOMA Body Model - Parametric human body model with standardized skeleton, mesh, and shape parameters
- SOMA Retargeter
- GEM-X - Human motion estimation from video
- Kimodo - Kinematic motion diffusion model for text and constraint-driven 3D human and robot motion generation
- ProtoMotions - GPU-accelerated simulation and learning framework for training physically simulated digital humans and humanoid robots
- SONIC - Whole-body control for humanoid robots, training locomotion and interaction policies
Apache 2.0 — see LICENSE.