Production deploy artifacts for the relay JEPA contact-push puzzle — a 26M-param ViT-Tiny encoder + AR transformer predictor + DexWM joint-state head, trained on actor-pattern-corrected data with a uniform-sampled head decoder. Plans via CEM (cross-entropy method) in latent space.
Runs on the JEPA inference VPS at jepa.waweapps.win/relay/ behind a Caddy
reverse proxy (jepa-vps-proxy).
A self-contained code subset of the private aura research repo: only what's needed to run the inference server. No training scripts that could reproduce the model (those stay in the private repo), no data, no notebooks.
- `Dockerfile` — CPU PyTorch + DINOv2 weights pre-downloaded at build time
- `docker-compose.yml` — joins the shared external `web` Docker network
- `world_model/`, `scripts/`, `client/relay/` — minimum source for the FastAPI inference server + the browser client
- `checkpoints/` — gitignored; populate via scp
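For orientation, the compose file amounts to roughly the following (a sketch, not the actual file; the service name `relay` and internal port 8800 are taken from the proxy routing described below):

```yaml
# Sketch only — the real docker-compose.yml lives in this repo.
services:
  relay:
    build: .            # Dockerfile bakes in CPU PyTorch + DINOv2 weights
    networks: [web]     # same network as the Caddy proxy; no host ports needed
networks:
  web:
    external: true      # created/owned by the jepa-vps-proxy setup
```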
Prereqs: the Caddy proxy must already be running on the `web` network with `/relay/*` → `relay:8800` routing (see the jepa-vps-proxy repo).
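That route looks roughly like this in the proxy's Caddyfile (a sketch; the authoritative config lives in the jepa-vps-proxy repo, and whether the `/relay` prefix is stripped depends on that config):

```
jepa.waweapps.win {
    # Forward /relay/* to the relay container on the shared web network
    handle_path /relay/* {
        reverse_proxy relay:8800
    }
}
```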
On the VPS:

```sh
ssh jepa-vps
cd /srv/relay
git clone https://github.com/SotoAlt/relay-deploy.git .
docker compose up -d --build
```

Then from your Mac, scp the checkpoint over (one-time):
```sh
scp /path/to/relay_stage1_v9_trackE_ep02_uhead.pt \
    jepa-vps:/srv/relay/checkpoints/
ssh jepa-vps 'cd /srv/relay && docker compose restart'
```

Verify: open https://jepa.waweapps.win/relay/.
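The checkpoint push + restart can be wrapped in a small helper (hypothetical, not part of the repo; the host alias, deploy dir, and checkpoint filename are the ones used above, and execution is commented out so the script only prints its plan):

```shell
#!/usr/bin/env bash
# Hypothetical helper for the one-time checkpoint push + restart.
set -u

HOST="jepa-vps"                                  # SSH alias used above
DIR="/srv/relay"                                 # deploy dir on the VPS
CKPT="relay_stage1_v9_trackE_ep02_uhead.pt"      # adjust to your local path

# Build the command list first so it can be reviewed before anything runs.
PLAN=(
  "scp $CKPT $HOST:$DIR/checkpoints/"
  "ssh $HOST 'cd $DIR && docker compose restart'"
)

for cmd in "${PLAN[@]}"; do
  echo "+ $cmd"
  # eval "$cmd"   # uncomment to actually run each step
done
```

Printing the plan before executing makes a typo in the path or host alias visible before anything touches the VPS.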
To update later:

```sh
ssh jepa-vps 'cd /srv/relay && git pull && docker compose up -d --build'
```

Caddy and the other apps on the VPS (lepong, …) are not touched; this repo deploys and updates independently by design.