MoCaNet: Motion Retargeting in-the-wild via Canonicalization Networks

This is the official PyTorch implementation of MoCaNet: Motion Retargeting in-the-wild via Canonicalization Networks (AAAI 2022).

Paper | Project Page | Poster

[Figure: fig_cano_idea, an overview of the canonicalization idea]

Environment

conda install pytorch torchvision cudatoolkit=10.1 -c pytorch
conda install pyyaml scikit-image scikit-learn opencv
pip install -r requirements.txt
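
After installation, a quick sanity check can confirm that the key packages import and that PyTorch sees the GPU. This is a minimal sketch, not part of the repository; the versions it prints depend on your setup.

# sanity_check.py: verify the conda/pip environment described above (illustrative only)
import torch
import torchvision
import cv2

print("PyTorch:", torch.__version__)          # expect a CUDA 10.1 build per the install command
print("torchvision:", torchvision.__version__)
print("OpenCV:", cv2.__version__)
print("CUDA available:", torch.cuda.is_available())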

Data

We use the Mixamo and SoloDancer datasets. Please refer to TransMoMo for detailed instructions.

Training

For a description of the training options, run:

python train.py --help

To train on Mixamo:

python train.py --config configs/mocanet.yaml

To train on SoloDancer:

python train.py --config configs/mocanet_solodancer.yaml
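
To see which hyperparameters a run will use, the YAML config passed to train.py can be inspected directly with pyyaml (installed above). This is a minimal sketch; the actual keys and values depend on the config files shipped with the repository.

# inspect_config.py: print the top-level options of a training config (illustrative only)
import yaml

with open("configs/mocanet.yaml") as f:
    cfg = yaml.safe_load(f)

# These are the settings that `python train.py --config configs/mocanet.yaml` will pick up.
for key, value in cfg.items():
    print(f"{key}: {value}")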

Citation

If you use our code or models in your research, please cite:

W. Zhu*, Z. Yang*, Z. Di, W. Wu+, Y. Wang, C. C. Loy. "MoCaNet: Motion Retargeting in-the-wild via Canonicalization Networks." Thirty-Sixth AAAI Conference on Artificial Intelligence (AAAI), 2022.

BibTeX:

@inproceedings{mocanet2022,
    title={MoCaNet: Motion Retargeting in-the-wild via Canonicalization Networks},
    author={Zhu, Wentao and Yang, Zhuoqian and Di, Ziang and Wu, Wayne and Wang, Yizhou and Loy, Chen Change},
    booktitle={AAAI},
    year={2022}
}
