This work (Multi-STGAE) extends our conference paper presented at ISMAR 2021, which was published in TVCG.
We recommend that readers watch the supplementary video of Multi-STGAE to better understand the framework and to view the qualitative results.
Resources: Multi-STGAE video | Multi-STGAE webpage | STGAE webpage
Framework overview of our proposed method, Multi-STGAE: we exploit a prediction task to build a multi-task framework for hand motion denoising. With this framework, the denoised result preserves the temporal dynamics of the motion, and the time-delay problem is greatly alleviated, providing users with a satisfying interactive experience.

Download the dataset from the BUAA cloud disk and save it in the data/<dataset name> directory. Should you encounter any issues accessing the link or require further details, please contact us directly.
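For example, the expected layout can be created as follows; "nyu" here is only an illustrative dataset name:

```shell
# Create the expected data directory ("nyu" is an example dataset name)
mkdir -p data/nyu
# After downloading, place the dataset files under data/nyu/
ls data
```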
To get started, first clone this project and then install the required dependencies.
Install the required packages:
pip install -r requirements.txt
This will install all the required packages listed in the requirements.txt file.
We also provide a visualization tool for generating video files of the results. To use this tool, the following dependencies need to be installed:
LaTeX
sudo apt-get install texlive-full
FFmpeg
sudo apt-get install ffmpeg
pydot & graphviz
sudo pip3 install pydot
sudo pip3 install graphviz
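A quick way to confirm that the system-level dependencies above are available is to check whether they are on the PATH (this check is only a convenience, not part of the project):

```shell
# Report which optional visualization dependencies are installed
for tool in latex ffmpeg dot; do
  if command -v "$tool" >/dev/null 2>&1; then
    echo "$tool: found"
  else
    echo "$tool: missing"
  fi
done
```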
To run the code, you can use either a configuration file or command-line parameters. To use a configuration file, first edit config/<dataset name>.yaml.
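As a rough illustration, such a configuration file might look like the sketch below; all keys and values are hypothetical (mirroring the command-line parameters) and should be checked against the actual file shipped with the repository:

```yaml
# Hypothetical sketch of config/nyu.yaml -- keys are illustrative only
phase: train     # train or test
dataset: nyu     # dataset name
device: 0        # GPU index
```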
To train from scratch, run the following command:
python main.py --phase train --dataset nyu --device 0 **args
To see the meaning of each parameter, run:
python main.py --help
To test the code with pre-trained weights, run the following command:
python main.py --phase test
To visualize the results, use the following command:
python main.py --phase test --vis True
Feel free to contact me via zhoukanglei[at]qq.com.