A TypeScript three.js WebGPU inference port of the biped + quadruped neural-motion-matching demos from facebookresearch/ai4animationpy by Paul Starke and Sebastian Starke.
Live demo: motionsynth.sweriko.com
- Runs the upstream biped + quadruped neural networks directly in the browser
- Batched compute: one dispatch per prediction tick across every active agent, with mixed-precision matmul kernels (fp16 weights, fp32 accumulator).
- All agents share a single instanced SkinnedMesh
- Basic control interface
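The batching idea above can be sketched on the CPU as a plain TypeScript reference (hypothetical function and parameter names; the actual implementation runs as a single WebGPU compute dispatch with fp16 weights and an fp32 accumulator):

```typescript
// Batched linear layer: one "dispatch" computes y = x·Wᵀ + b for every
// active agent at once, instead of one inference call per agent.
// Shapes: x is [numAgents, inDim], W is [outDim, inDim], b is [outDim].
function batchedLinear(
  x: Float32Array, numAgents: number, inDim: number,
  W: Float32Array, b: Float32Array, outDim: number,
): Float32Array {
  const y = new Float32Array(numAgents * outDim);
  for (let a = 0; a < numAgents; a++) {   // one GPU thread per agent in the real kernel
    for (let o = 0; o < outDim; o++) {
      let acc = b[o];                     // fp32 accumulator
      for (let i = 0; i < inDim; i++) {
        acc += x[a * inDim + i] * W[o * inDim + i];
      }
      y[a * outDim + o] = acc;
    }
  }
  return y;
}

// Two agents, a 2 -> 1 layer with W = [1, 1], b = [0.5]:
const out = batchedLinear(
  new Float32Array([1, 2, 3, 4]), 2, 2,
  new Float32Array([1, 1]), new Float32Array([0.5]), 1,
);
// out[0] = 1 + 2 + 0.5 = 3.5, out[1] = 3 + 4 + 0.5 = 7.5
```

On the GPU, the outer agent loop becomes the dispatch grid, so the per-tick cost is one kernel launch regardless of how many agents are active.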
- `npm install`
- `npm run dev`

This is a derivative work of AI4AnimationPy. See NOTICE for full attribution and LICENSE for the license terms (CC BY-NC 4.0 — non-commercial use only). The "ai4anim-" prefix is descriptive; this project is not affiliated with or endorsed by Meta or the original authors.