Official implementation of the paper: Sub-JEPA: Subspace Gaussian Regularization for Stable End-to-End World Models.
Joint-Embedding Predictive Architectures (JEPAs) offer an effective framework for learning world models, but without sufficient constraints, their representation spaces can easily collapse. While recent methods like LeWM prevent this by enforcing a strong isotropic Gaussian prior in the high-dimensional ambient space, this can introduce an overly strong structural bias.
Sub-JEPA relaxes this global constraint by applying Gaussian regularization across multiple random subspaces instead of the original embedding space.
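To make the core idea concrete, here is a rough, self-contained sketch of multi-subspace Gaussian regularization: project embeddings onto several random low-dimensional subspaces and penalize the deviation of the first two moments from an isotropic Gaussian in each. This is an illustration of the principle only, not the released `MultiSubspaceSIGReg` implementation in `subjepa.py` (the projection scheme and the exact statistic used there may differ).

```python
import torch

def multi_subspace_gaussian_reg(z, num_subspaces=32, subspace_dim=None, generator=None):
    """Toy sketch: match N(0, I) statistics in K random subspaces.

    z: (batch, embed_dim) embeddings. Each subspace uses a fresh random
    orthonormal basis (via QR); the paper's method may use fixed or
    trainable projections instead.
    """
    batch, embed_dim = z.shape
    k = subspace_dim or embed_dim // num_subspaces
    loss = z.new_zeros(())
    for _ in range(num_subspaces):
        # Random orthonormal basis spanning a k-dimensional subspace.
        q, _ = torch.linalg.qr(torch.randn(embed_dim, k, generator=generator))
        p = z @ q                                   # (batch, k) subspace coordinates
        mean = p.mean(dim=0)
        cov = (p - mean).T @ (p - mean) / max(batch - 1, 1)
        # Penalize deviation of the first two moments from N(0, I_k).
        loss = loss + mean.pow(2).sum() + (cov - torch.eye(k)).pow(2).sum()
    return loss / num_subspaces
```

Because each penalty lives in a small random subspace rather than the full ambient space, the Gaussian constraint is applied only along random directions, which is the relaxation described above.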
Sub-JEPA consistently improves over LeWM across four continuous-control environments.
| Method | Two-Room | Reacher | PushT | OGB-Cube |
|---|---|---|---|---|
| LeWM | 84.33 ± 4.23 | 82.67 ± 4.42 | 84.67 ± 6.53 | 67.33 ± 5.01 |
| Sub-JEPA | 95.00 ± 2.76 | 84.00 ± 4.00 | 89.00 ± 5.33 | 76.33 ± 5.99 |
```
Sub-JEPA/
├── le-wm/               # Upstream LeWM codebase as a git submodule
├── subjepa.py           # MultiSubspaceSIGReg implementation
├── lewm_subjepa.patch   # Patch that integrates Sub-JEPA into LeWM
└── README.md
```
```bash
git clone --recursive https://github.com/intcomp/Sub-JEPA.git
cd Sub-JEPA
```

If you forgot `--recursive`, you can run:

```bash
git submodule update --init --recursive
```

Then apply the Sub-JEPA patch to the submodule:

```bash
git -C le-wm apply ../lewm_subjepa.patch
```

Follow the upstream LeWM instructions:
- Installation: https://github.com/lucas-maes/le-wm#using-the-code
- Data layout: https://github.com/lucas-maes/le-wm#data
In particular, LeWM (and this repo) expects datasets and checkpoints under `$STABLEWM_HOME` (defaults to `~/.stable-wm/`).
We provide pretrained Sub-JEPA checkpoints on Hugging Face.
You can download all released checkpoints with:
```bash
pip install -U huggingface_hub
hf download intcomp/sub-jepa
```

Training is identical to LeWM and is configured with Hydra. The patch modifies `le-wm/train.py` and `le-wm/config/train/lewm.yaml` to use multi-subspace regularization.
```bash
PYTHONPATH=. python le-wm/train.py data=tworoom
```

Make sure to set your WandB entity and project in `le-wm/config/train/lewm.yaml`, or disable WandB:

```bash
PYTHONPATH=. python le-wm/train.py data=tworoom wandb.enabled=false
```

All Sub-JEPA knobs live under `loss.sigreg` in `le-wm/config/train/lewm.yaml`.
| Hyperparameter | Description |
|---|---|
| `loss.sigreg.weight` | Overall regularization weight. Same role as the LeWM SIGReg weight. |
| `loss.sigreg.kwargs.num_subspaces` | Number of subspaces, denoted as K in the paper. |
| `loss.sigreg.kwargs.subspace_dim` | Dimension of each subspace. If `null`, uses `embed_dim / K` and requires divisibility. |
| `loss.sigreg.kwargs.init_mode` | Projection initialization mode. |
| `loss.sigreg.theta` | Soft orthogonality penalty weight. Only used by trainable projection variants. |
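As an illustration only, the knobs above might be set like the fragment below. The key nesting is an assumption based on the dotted names in the table, and `weight`/`theta` values are placeholders; `num_subspaces: 32` and `init_mode: orthogonal_frozen` follow the paper's default run shown later in this README.

```yaml
# Hypothetical fragment of le-wm/config/train/lewm.yaml; nesting is assumed.
loss:
  sigreg:
    weight: 1.0              # overall regularization weight (placeholder value)
    theta: 0.0               # soft orthogonality penalty; trainable projections only
    kwargs:
      num_subspaces: 32      # K
      subspace_dim: null     # null => embed_dim / K (requires divisibility)
      init_mode: orthogonal_frozen
```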
Example: To train the model using the default configuration described in our paper, run:
```bash
CUDA_VISIBLE_DEVICES=0 PYTHONPATH=. python le-wm/train.py \
    data=tworoom \
    subdir=tworoom/subjepa \
    loss.sigreg.kwargs.init_mode=orthogonal_frozen \
    loss.sigreg.kwargs.num_subspaces=32 \
    trainer.max_epochs=10
```

Evaluation configs are located under `le-wm/config/eval/`.
```bash
python le-wm/eval.py --config-name=tworoom.yaml policy=tworoom/subjepa
```

`policy` must be the checkpoint path relative to `$STABLEWM_HOME`, without the `_object.ckpt` suffix.
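To make the path convention concrete, this hypothetical helper (not part of the LeWM codebase) resolves a `policy` value to the checkpoint file it refers to, following the rule stated above:

```python
import os

def resolve_policy_ckpt(policy: str) -> str:
    """Illustrative helper: map a `policy` config value to a checkpoint path.

    Per the convention above, `policy` is relative to $STABLEWM_HOME
    (default ~/.stable-wm/) and omits the `_object.ckpt` suffix.
    """
    root = os.environ.get("STABLEWM_HOME", os.path.expanduser("~/.stable-wm"))
    return os.path.join(root, policy + "_object.ckpt")
```

For example, `resolve_policy_ckpt("tworoom/subjepa")` points at `tworoom/subjepa_object.ckpt` under `$STABLEWM_HOME`.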
Example: Full evaluation with our paper's default seeds:
```bash
python le-wm/eval.py --config-name=tworoom.yaml policy=tworoom/subjepa seed=42,100,2026,3407,1234,4444 --multirun
```

This codebase is built on top of the official LeWorldModel implementation. We thank the authors of LeWM for releasing their codebase.
If you find our work useful in your research, please consider citing:
```bibtex
@misc{zhao2026subjepa,
  title         = {Sub-JEPA: Subspace Gaussian Regularization for Stable End-to-End World Models},
  author        = {Zhao, Kai and Nie, Dongliang and Lin, Yuchen and Luo, Zhehan and Gu, Yixiao and Fan, Deng-Ping and Zeng, Dan},
  year          = {2026},
  eprint        = {2605.09241},
  archivePrefix = {arXiv},
  primaryClass  = {cs.LG},
  url           = {https://arxiv.org/abs/2605.09241}
}
```