CoDA: Coordinated Diffusion Noise Optimization for Whole-Body Manipulation of Articulated Objects

Huaijin Pi, Zhi Cen, Zhiyang Dou, Taku Komura
NeurIPS 2025

News

[March 25, 2026] GRAB-related checkpoints fixed.

[January 5, 2026] GRAB training, evaluation, and visualization code released.

[October 13, 2025] ARCTIC training and evaluation code released.

TODOs

  • Release ARCTIC training code.

  • Release ARCTIC evaluation code.

  • Release GRAB training code.

  • Release GRAB evaluation code.

  • Release visualization code.

Dependencies

To set up the environment and data, follow these steps:

  1. Create a new conda environment and install the dependencies:

conda create -n coda python=3.10
conda activate coda
pip install -r requirements.txt
pip install -e .

  2. Download the SMPL parameters from SMPL and SMPLX.

  3. Download the CLIP (clip-vit-base-patch32) and GloVe checkpoints (refer to this link).

  4. Download the ARCTIC dataset and the GRAB dataset.

  5. Download the preprocessed data from OneDrive.

Note that we do not intend to distribute the original datasets; you need to download them (annotations, videos, etc.) from the original websites. We are unable to provide the original data due to license restrictions. By downloading the preprocessed data, you agree to the original datasets' terms of use and to use the data for research purposes only.

  6. Download our pretrained checkpoints (including shared models and ARCTIC-specific checkpoints) from OneDrive and the evaluator from OneDrive.

  6a. For GRAB testing, additionally download checkpoints from OneDrive and the evaluator from OneDrive. Place the GRAB checkpoints under inputs/release_checkpoints/.

  7. Rename the downloaded files and organize them following this file structure:

inputs
├── checkpoints
│   ├── body_models/smplx/
│   │   └── SMPLX_{GENDER}.npz # SMPLX (we predict SMPLX params; also used for evaluation)
│   ├── body_models/smpl/
│   │   └── SMPL_{GENDER}.pkl  # SMPL (rendering and evaluation)
│   ├── glove
│   ├── huggingface
│   │   └── clip-vit-base-patch32
│   ├── arcticobj
│   └── grab_short  # Only for GRAB testing
├── amass
├── arctic
├── arctic_neutral
├── grab_extracted
├── grab_neutral
└── release_checkpoints  # Place GRAB-specific checkpoints here (from step 6a)

  8. Compute the corresponding BPS representations:

python tools/preprocess/arctic_bps.py
python tools/preprocess/grab_bps.py
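
The two scripts above precompute basis point set (BPS) features for the object geometry: a point cloud is encoded as the distance from each point in a fixed random basis to its nearest cloud point. As background, the core idea can be sketched in a few lines of NumPy (illustrative only; the function name and parameters below are not from this repo, and the actual preprocessing scripts may differ):

```python
import numpy as np

def compute_bps(points, n_basis=1024, radius=1.0, seed=0):
    """Encode a point cloud (N, 3) as distances from fixed basis points.

    Illustrative sketch of the BPS idea; not the repo's implementation.
    """
    rng = np.random.default_rng(seed)
    # Sample basis points uniformly inside a ball of the given radius:
    # random directions scaled by cube-root-distributed radii.
    basis = rng.normal(size=(n_basis, 3))
    basis /= np.linalg.norm(basis, axis=1, keepdims=True)
    basis *= radius * rng.uniform(size=(n_basis, 1)) ** (1.0 / 3.0)
    # Distance from each basis point to its nearest cloud point.
    dists = np.linalg.norm(basis[:, None, :] - points[None, :, :], axis=-1)
    return dists.min(axis=1)  # shape: (n_basis,)
```

Because the basis is fixed (seeded), two shapes encoded this way live in the same feature space, which is what makes the representation useful as a conditioning signal.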

Evaluation

Test with our provided checkpoints on the ARCTIC dataset.

python tools/train.py exp=wholebody/obj_arctic global/task=wholebody/test_arctic

Test with our provided checkpoints on the GRAB dataset.

python tools/train.py exp=wholebody/obj_grab global/task=wholebody/test_grab

Visualization

After testing, you can visualize the results using Wis3D:

wis3d --vis_dir ./outputs/wis3d --host 0.0.0.0 --port 19090 

Then open http://localhost:19090 in your browser to view the visualization results.

Training

  1. Train the object trajectory model.

    For ARCTIC dataset:

    python tools/train.py exp=objtraj/arctic
    

    For GRAB dataset:

    python tools/train.py exp=objtraj/grab
    
  2. Train the end-effector trajectory model.

    For ARCTIC dataset:

    python tools/train.py exp=ee/arctic
    

    For GRAB dataset:

    python tools/train.py exp=ee/grab
    
  3. Train separate motion diffusion models.

python tools/train.py exp=handpose/lefthand_mixed
python tools/train.py exp=handpose/righthand_mixed
python tools/train.py exp=bodypose/mixed

After training, assign the resulting checkpoint paths in coda/configs/global/task/wholebody/test_arctic.yaml (and the analogous test_grab.yaml for GRAB) before running the evaluation.
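
For example, the edited config might look like the following excerpt. The key names here are assumptions for illustration only; check the existing entries in the yaml for the actual field names and the checkpoint filenames your runs produced:

```yaml
# coda/configs/global/task/wholebody/test_arctic.yaml (illustrative excerpt)
objtraj_ckpt: outputs/objtraj/arctic/checkpoints/last.ckpt          # step 1
ee_ckpt: outputs/ee/arctic/checkpoints/last.ckpt                    # step 2
lefthand_ckpt: outputs/handpose/lefthand_mixed/checkpoints/last.ckpt
righthand_ckpt: outputs/handpose/righthand_mixed/checkpoints/last.ckpt
body_ckpt: outputs/bodypose/mixed/checkpoints/last.ckpt             # step 3
```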

Citation

If you find this code useful for your research, please use the following BibTeX entry.

@article{pi2025coda,
  title={CoDA: Coordinated Diffusion Noise Optimization for Whole-Body Manipulation of Articulated Objects},
  author={Pi, Huaijin and Cen, Zhi and Dou, Zhiyang and Komura, Taku},
  journal={Advances in Neural Information Processing Systems},
  year={2025}
}

Acknowledgement

We thank the authors of GVHMR, HGHOI, and DNO for their great works, without which our project/code would not be possible.
