Pose-Guided-Person-Image-Generation

TensorFlow implementation of the NIPS 2017 paper "Pose Guided Person Image Generation"


Network architecture


Dependencies

  • python 2.7
  • tensorflow-gpu (1.4.1)
  • numpy (1.14.0)
  • Pillow (5.0.0)
  • scikit-image (0.13.0)
  • scipy (1.0.1)
  • matplotlib (2.0.0)
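
As a convenience, the dependencies above can be installed with pip; the sketch below assumes a Python 2.7 virtualenv (the environment name pg2-env is only an example, the pinned versions mirror the list above, and availability of these older wheels may depend on your platform).

```bash
# Example environment setup (hypothetical virtualenv name; adjust to your system).
virtualenv -p python2.7 pg2-env
source pg2-env/bin/activate
pip install tensorflow-gpu==1.4.1 numpy==1.14.0 Pillow==5.0.0 \
    scikit-image==0.13.0 scipy==1.0.1 matplotlib==2.0.0
```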

Resources

TF-record data preparation steps

You can skip this data preparation procedure if you use the provided tf-record data files directly.

  1. cd datasets
  2. Run ./run_convert_market.sh to download and convert the original Market-1501 images, poses, attributes, and segmentations.
  3. Run ./run_convert_DF.sh to download and convert the original DeepFashion images and poses.

Note: we also provide conversion code for the Market-1501 attributes and for the Market-1501 segmentation results from PSPNet. This extra information is provided for further research. In our experiments, pose masks are obtained from the pose key-points (see the _getPoseMask function in the convert .py files).
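
Taken together, the preparation steps above amount to something like the following, run from the repository root (the dataset names in the comments reflect our understanding that the market scripts target Market-1501 and the DF scripts target DeepFashion):

```bash
# Consolidated form of the data-preparation steps above.
cd datasets
./run_convert_market.sh   # Market-1501: images, poses, attributes, segmentations
./run_convert_DF.sh       # DeepFashion: images, poses
```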

Training steps

  1. Download the tf-record training data.
  2. Modify the model_dir in the run_market_train.sh/run_DF_train.sh scripts.
  3. Run run_market_train.sh or run_DF_train.sh.

Note: we use a triplet instead of a real/fake pair for adversarial training, which keeps training more stable.
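
For reference, a typical training launch looks like the sketch below (Market-1501 variant shown); the checkpoint path is only a placeholder, and model_dir is edited inside the script itself as described in step 2.

```bash
# Sketch of a training launch; use run_DF_train.sh for the DeepFashion model.
# Before running, edit the script so that model_dir points at your
# checkpoint/output directory, e.g. /path/to/checkpoints (placeholder).
bash run_market_train.sh
```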

Testing steps

  1. Download the pretrained models and tf-record testing data.
  2. Modify the model_dir in the run_market_test.sh/run_DF_test.sh scripts.
  3. Run run_market_test.sh or run_DF_test.sh.
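
A test run follows the same pattern; again, the path is a placeholder and model_dir is edited inside the script.

```bash
# Sketch of a test launch; use run_DF_test.sh for the DeepFashion model.
# Before running, edit the script so that model_dir points at the directory
# containing the downloaded pretrained model, e.g. /path/to/pretrained (placeholder).
bash run_market_test.sh
```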

Other implementations

PyTorch implementation: Human-Pose-Transfer

Citation

@inproceedings{ma2017pose,
  title={Pose guided person image generation},
  author={Ma, Liqian and Jia, Xu and Sun, Qianru and Schiele, Bernt and Tuytelaars, Tinne and Van Gool, Luc},
  booktitle={Advances in Neural Information Processing Systems},
  pages={405--415},
  year={2017}
}

Related projects
