CoDA: Coordinated Diffusion Noise Optimization for Whole-Body Manipulation of Articulated Objects
Huaijin Pi, Zhi Cen, Zhiyang Dou, Taku Komura
NeurIPS 2025
[March 25, 2026] GRAB-related checkpoints fixed.
[January 5, 2026] GRAB training, evaluation, and visualization code released.
[October 13, 2025] ARCTIC training and evaluation code released.
- Release ARCTIC training code.
- Release ARCTIC evaluation code.
- Release GRAB training code.
- Release GRAB evaluation code.
- Release visualization code.
To create the environment, follow these instructions:
- Create a new conda environment and install PyTorch along with the other dependencies:
conda create -n coda python=3.10
conda activate coda
pip install -r requirements.txt
pip install -e .
- Download the CLIP checkpoint (clip-vit-base-patch32) and GloVe checkpoints (refer to this link).
- Download the preprocessed data from OneDrive.
Note that we do not intend to distribute the original datasets; you need to download them (annotations, videos, etc.) from the original websites. We are unable to provide the original data due to license restrictions. By downloading the preprocessed data, you agree to the original datasets' terms of use and to use the data for research purposes only.
- Download our pretrained checkpoints (including shared models and ARCTIC-specific checkpoints) from OneDrive and the evaluator from OneDrive.
6a. For GRAB testing, additionally download checkpoints from OneDrive and evaluator from OneDrive. Place the GRAB checkpoints under inputs/release_checkpoints/.
- Rename these downloaded files and organize them following the file structure:
inputs
├── checkpoints
│ ├── body_models/smplx/
│ │ └── SMPLX_{GENDER}.npz # SMPLX (We predict SMPLX params + evaluation)
│ ├── body_models/smpl/
│ │ └── SMPL_{GENDER}.pkl # SMPL (rendering and evaluation)
│ ├── glove
│ ├── huggingface
│ │ └── clip-vit-base-patch32
│ ├── arcticobj
│ └── grab_short # Only for the GRAB testing
├── amass
├── arctic
├── arctic_neutral
├── grab_extracted
├── grab_neutral
└── release_checkpoints # Place GRAB-specific checkpoints here (from step 6a)
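To set up this layout quickly, here is a minimal shell sketch that creates the expected `inputs/` skeleton (directory names are taken from the tree above; the downloaded files themselves still need to be copied in by hand):

```shell
#!/bin/sh
# Create the expected inputs/ directory skeleton, then copy the
# downloaded files into the matching directories afterwards.
mkdir -p inputs/checkpoints/body_models/smplx
mkdir -p inputs/checkpoints/body_models/smpl
mkdir -p inputs/checkpoints/glove
mkdir -p inputs/checkpoints/huggingface/clip-vit-base-patch32
mkdir -p inputs/checkpoints/arcticobj
mkdir -p inputs/checkpoints/grab_short        # only for GRAB testing
mkdir -p inputs/amass inputs/arctic inputs/arctic_neutral
mkdir -p inputs/grab_extracted inputs/grab_neutral
mkdir -p inputs/release_checkpoints           # GRAB-specific checkpoints (step 6a)

# Quick sanity check: list any directory that is still empty,
# i.e. still waiting for its downloaded files.
for d in inputs/checkpoints/body_models/smplx inputs/release_checkpoints; do
  [ -z "$(ls -A "$d" 2>/dev/null)" ] && echo "NOTE: $d is still empty"
done
```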
- Compute the corresponding BPS (basis point set) representation:
python tools/preprocess/arctic_bps.py
python tools/preprocess/grab_bps.py
Test with our provided checkpoints on the ARCTIC dataset.
python tools/train.py exp=wholebody/obj_arctic global/task=wholebody/test_arctic
Test with our provided checkpoints on the GRAB dataset.
python tools/train.py exp=wholebody/obj_grab global/task=wholebody/test_grab
After testing, you can visualize the results using Wis3D:
wis3d --vis_dir ./outputs/wis3d --host 0.0.0.0 --port 19090
Then open http://localhost:19090 in your browser to view the visualization results.
- Train the object trajectory model.
For the ARCTIC dataset:
python tools/train.py exp=objtraj/arctic
For the GRAB dataset:
python tools/train.py exp=objtraj/grab
- Train the end-effector trajectory model.
For the ARCTIC dataset:
python tools/train.py exp=ee/arctic
For the GRAB dataset:
python tools/train.py exp=ee/grab
- Train the separate motion diffusion models.
python tools/train.py exp=handpose/lefthand_mixed
python tools/train.py exp=handpose/righthand_mixed
python tools/train.py exp=bodypose/mixed
After training, assign the corresponding checkpoint paths in coda/configs/global/task/wholebody/test_arctic.yaml for further evaluation.
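The exact keys in test_arctic.yaml depend on the released config, so the following is only a hypothetical sketch of what the checkpoint entries might look like (all field names here are assumptions; match them against the shipped file):

```yaml
# Hypothetical field names -- check the shipped
# coda/configs/global/task/wholebody/test_arctic.yaml for the real keys.
objtraj_ckpt: outputs/objtraj/arctic/checkpoints/last.ckpt
ee_ckpt: outputs/ee/arctic/checkpoints/last.ckpt
lefthand_ckpt: outputs/handpose/lefthand_mixed/checkpoints/last.ckpt
righthand_ckpt: outputs/handpose/righthand_mixed/checkpoints/last.ckpt
body_ckpt: outputs/bodypose/mixed/checkpoints/last.ckpt
```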
If you find this code useful for your research, please use the following BibTeX entry.
@article{pi2025coda,
title={CoDA: Coordinated Diffusion Noise Optimization for Whole-Body Manipulation of Articulated Objects},
author={Pi, Huaijin and Cen, Zhi and Dou, Zhiyang and Komura, Taku},
journal={Advances in Neural Information Processing Systems},
year={2025}
}
We thank the authors of GVHMR, HGHOI, and DNO for their great works, without which our project/code would not be possible.
