Human Motion Prediction using Martinez et al. model

This fork is intended for the locomotion style data from Mason et al. 2018 (few-shot learning).

While the pytorch fork of Martinez's original code quietly removed some features (e.g. the residual layer option: it is now always on, regardless of the argument, so make sure the args are actually plumbed through before you rely on them!), this fork removes considerably more: actions are ignored, and one-hot inputs are no longer present. I therefore strongly recommend following the fork back to the original code, or at least to the pytorch version. Nevertheless, the following commands work:

Train:

python translate.py --action walking --seq_length_out 64 --iterations 5000  --style_ix 1 --learning_rate 0.005

Generate data:

python translate.py --action walking --seq_length_out 64 --style_ix ${i} --use_cpu --load ../experiments/model_${i}_5000_res --sample
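For example, to sample every style in one go, you can wrap the command in a loop. A minimal Python sketch, assuming style indices run from 1 to 8 (an assumption; adjust the range to your dataset):

```python
# Hypothetical sweep over style indices; the range 1..8 is an assumption,
# adjust it to however many styles your dataset contains.
import subprocess

for i in range(1, 9):
    subprocess.run(
        ["python", "translate.py",
         "--action", "walking",
         "--seq_length_out", "64",
         "--style_ix", str(i),
         "--use_cpu",
         "--load", f"../experiments/model_{i}_5000_res",
         "--sample"],
        check=True,  # stop the sweep if any run fails
    )
```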

Original (pytorch) readme below:


human-motion-prediction

This is a pytorch implementation of the paper

Julieta Martinez, Michael J. Black, Javier Romero. On human motion prediction using recurrent neural networks. In CVPR 17.

It is also available on arXiv: https://arxiv.org/pdf/1705.02445.pdf

The code in the original repository was written by Julieta Martinez and Javier Romero.

If you have any comments on this fork, you can email me at enriccorona93@gmail.com.

Dependencies

Get this code and the data

First things first, clone this repo and get the human3.6m dataset in exponential map format.

git clone https://github.com/enriccorona/human-motion-prediction-pytorch.git
cd human-motion-prediction-pytorch
mkdir data
cd data
wget http://www.cs.stanford.edu/people/ashesh/h3.6m.zip
unzip h3.6m.zip
rm h3.6m.zip
cd ..
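For background, "exponential map format" stores each joint rotation as a 3-vector whose direction is the rotation axis and whose norm is the rotation angle. The sketch below converts such a vector to a rotation matrix via Rodrigues' formula; it is an illustration only (src/data_utils.py contains the repo's own conversion routines):

```python
# Minimal sketch: exponential map (axis-angle 3-vector) -> 3x3 rotation matrix.
# Illustration only; the repo ships its own helpers in src/data_utils.py.
import numpy as np

def expmap_to_rotmat(r):
    theta = np.linalg.norm(r)
    if theta < 1e-12:
        return np.eye(3)                  # (near-)zero rotation
    k = r / theta                         # unit rotation axis
    K = np.array([[0.0, -k[2], k[1]],
                  [k[2], 0.0, -k[0]],
                  [-k[1], k[0], 0.0]])    # skew-symmetric cross-product matrix
    # Rodrigues' formula: R = I + sin(theta) K + (1 - cos(theta)) K^2
    return np.eye(3) + np.sin(theta) * K + (1.0 - np.cos(theta)) * (K @ K)
```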

Quick demo and visualization

The code in this fork should work exactly as in the original repo:

For a quick demo, you can train for a few iterations and visualize the outputs of your model.

To train, run

python src/translate.py --action walking --seq_length_out 25 --iterations 10000

To save some samples of the model, run

python src/translate.py --action walking --seq_length_out 25 --iterations 10000 --sample --load 10000

Finally, to visualize the samples run

python src/forward_kinematics.py

This should create a visualization similar to the animation shown in the original repository.
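Conceptually, the forward-kinematics step turns the predicted joint angles into 3d positions by walking the skeleton tree from the root. A simplified sketch (not the repo's implementation; the H3.6M skeleton's parent indices and bone offsets are omitted):

```python
# Simplified forward kinematics: compose each joint's local rotation with its
# parent's global rotation and offset by the bone vector. Assumes joints are
# ordered so that every parent precedes its children.
import numpy as np

def forward_kinematics(parents, offsets, rotations):
    """parents: parent index per joint (-1 for the root).
    offsets: (J, 3) bone vectors. rotations: (J, 3, 3) local rotation matrices.
    Returns (J, 3) global joint positions."""
    J = len(parents)
    global_rot = [None] * J
    positions = np.zeros((J, 3))
    for j in range(J):
        if parents[j] == -1:                          # root joint
            global_rot[j] = rotations[j]
            positions[j] = offsets[j]
        else:
            p = parents[j]
            global_rot[j] = global_rot[p] @ rotations[j]
            positions[j] = positions[p] + global_rot[p] @ offsets[j]
    return positions
```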



Running average baselines

To reproduce the running average baseline results from our paper, run

python src/baselines.py
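For intuition, a running-average baseline simply predicts every future frame as the mean of the last few observed frames. A minimal sketch (a simplification, not the code in src/baselines.py):

```python
# Running-average baseline: predict a constant pose equal to the mean of the
# last n observed frames. Simplified illustration.
import numpy as np

def running_average_baseline(observed, horizon, n=2):
    """observed: (T, D) array of past poses. Returns (horizon, D) predictions."""
    mean_pose = observed[-n:].mean(axis=0)   # average the most recent n frames
    return np.tile(mean_pose, (horizon, 1))  # repeat it for every future frame
```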

RNN models

To train and reproduce the results of our models, use the following commands

| model | arguments | training time (gtx 1080) | notes |
| --- | --- | --- | --- |
| Sampling-based loss (SA) | python src/translate.py --action walking --seq_length_out 25 | 45s / 1000 iters | Realistic long-term motion; loss computed over 1 second. |
| Residual (SA) | python src/translate.py --residual_velocities --action walking | 35s / 1000 iters | |
| Residual unsup. (MA) | python src/translate.py --residual_velocities --learning_rate 0.005 --omit_one_hot | 65s / 1000 iters | |
| Residual sup. (MA) | python src/translate.py --residual_velocities --learning_rate 0.005 | 65s / 1000 iters | best quantitative. |
| Untied | python src/translate.py --residual_velocities --learning_rate 0.005 --architecture basic | 70s / 1000 iters | |
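For context on the --residual_velocities flag: the decoder predicts a delta that is added to the previous frame, so the network models velocities rather than absolute poses. A simplified pytorch sketch of the idea (not the repo's exact decoder):

```python
# Residual-velocity decoder step, simplified: the GRU cell predicts a pose
# delta, and the output is the previous pose plus that delta.
import torch.nn as nn

class ResidualDecoderStep(nn.Module):
    def __init__(self, pose_dim, hidden_dim):
        super().__init__()
        self.cell = nn.GRUCell(pose_dim, hidden_dim)
        self.out = nn.Linear(hidden_dim, pose_dim)

    def forward(self, prev_pose, hidden):
        hidden = self.cell(prev_pose, hidden)
        delta = self.out(hidden)            # predicted velocity
        return prev_pose + delta, hidden    # residual connection
```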

You can replace the --action walking parameter with any action in

["directions", "discussion", "eating", "greeting", "phoning",
 "posing", "purchases", "sitting", "sittingdown", "smoking",
 "takingphoto", "waiting", "walking", "walkingdog", "walkingtogether"]

or --action all (default) to train on all actions.

Citing

If you use our code, please cite our work

@inproceedings{julieta2017motion,
  title={On human motion prediction using recurrent neural networks},
  author={Martinez, Julieta and Black, Michael J. and Romero, Javier},
  booktitle={CVPR},
  year={2017}
}

Acknowledgments

The pre-processed human 3.6m dataset and some of our evaluation code (especially under src/data_utils.py) were ported/adapted from SRNN by @asheshjain399.

Licence

MIT
