Multi-Task Spatial-Temporal Graph Auto-Encoder for Hand Motion Denoising

This work (Multi-STGAE) extends our conference paper (STGAE) presented at ISMAR 2021 and has been published in TVCG.

Framework

We recommend watching the supplementary video of Multi-STGAE to gain a better understanding of the framework and to view the qualitative results.

Resources: Multi-STGAE video | Multi-STGAE webpage | STGAE webpage

Framework overview of our proposed method, Multi-STGAE: we introduce a prediction task to build a multi-task framework for hand motion denoising. Within this framework, the denoised result preserves the temporal dynamics of the motion, and the time-delay problem is greatly alleviated, providing users with a satisfying interactive experience.

Datasets

Download the dataset from the BUAA cloud disk and save it in the data/<dataset name> directory. Should you encounter any issues accessing the link or require further details, please contact us directly.

Installation

To get started, you will need to first clone this project and then install the required dependencies.

Basic packages

Install the required packages:

pip install -r requirements.txt

This will install all the required packages listed in the requirements.txt file.

Visualization dependencies

We also provide a visualization tool for generating video files of the results. To use this tool, the following dependencies need to be installed:

  • LaTeX tool
sudo apt-get install texlive-full
  • FFmpeg
sudo apt-get install ffmpeg
  • pydot & graphviz
sudo pip3 install pydot
sudo pip3 install graphviz

Running and visualizing the results

To run the code, one can either use a configuration file or command-line parameters. To use a configuration file, first edit config/<dataset name>.yaml.
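For illustration, a configuration file might look like the sketch below. The keys shown are hypothetical, mirroring the command-line parameters used later in this README; consult the files shipped in config/ for the actual schema.

```yaml
# Hypothetical sketch -- see config/ for the project's real options.
phase: train    # train or test
dataset: nyu    # dataset name
device: 0       # GPU index
vis: false      # whether to render result videos
```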

Training from scratch

To train from scratch, run the following command:

python main.py --phase train --dataset nyu --device 0 **args

One can see the meaning of each parameter with the following command:

python main.py --help

Testing with pre-trained weights

To test the code with pre-trained weights, run the following command:

python main.py --phase test 

Visualizing the results

To visualize the results, use the following command:

python main.py --phase test --vis True

Contact

Feel free to contact me via zhoukanglei[at]qq.com.
