Full-Body Motion Reconstruction with Sparse Sensing from Graph Perspective

The PyTorch implementation of the AAAI 2024 paper "Full-Body Motion Reconstruction with Sparse Sensing from Graph Perspective".

Datasets

  1. Download the AMASS dataset from https://amass.is.tue.mpg.de.
  2. Download the body model from http://smpl.is.tue.mpg.de and place the model files in the support_data/body_models directory of this repository.
  3. Run prepare_data.py to prepare the data that simulates sparse VR-device inputs. The train/test split follows the file lists in the data_split folder; a layout sketch is given after this list.
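
A minimal layout sketch before running the preparation script. Only support_data/body_models, data_split, and prepare_data.py come from the steps above; the AMASS extraction path and body-model file layout shown here are assumptions, so adjust them to match your downloads.

  data/AMASS/<subset>/*.npz             # extracted AMASS sequences (path is an assumption)
  support_data/body_models/smplh/...    # SMPL body model files from step 2
  data_split/                           # train/test split lists shipped with this repository

  python prepare_data.py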

Training

For training, please run:

python train.py

Testing

For testing, please run:

python test.py

Pretrained Models

Click Pretrained Models to download our pretrained model and put it into results/Avatar/models/ (see the sketch below).
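
A minimal sketch of placing the checkpoint; the file name pretrained_model.pth is an assumption, so keep whatever name the downloaded checkpoint actually has.

  mkdir -p results/Avatar/models/
  mv pretrained_model.pth results/Avatar/models/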

Citation

@article{Yao_Wu_Yi_2024,
title={Full-Body Motion Reconstruction with Sparse Sensing from Graph Perspective},
volume={38},
url={https://ojs.aaai.org/index.php/AAAI/article/view/28483},
DOI={10.1609/aaai.v38i7.28483},
number={7},
journal={Proceedings of the AAAI Conference on Artificial Intelligence},
author={Yao, Feiyu and Wu, Zongkai and Yi, Li},
year={2024},
month={Mar.},
pages={6612-6620}
}

License and Acknowledgement

This project is released under the MIT license. Our network-training code builds on the frameworks of AvatarPoser and SCI-NET.
