Graphonomy: Universal Human Parsing via Graph Transfer Learning

This repository contains the code for the paper:

Graphonomy: Universal Human Parsing via Graph Transfer Learning, Ke Gong, Yiming Gao, Xiaodan Liang, Xiaohui Shen, Meng Wang, Liang Lin.

Environment and installation

  • PyTorch = 0.4.0

  • torchvision

  • scipy

  • tensorboardX

  • numpy

  • opencv-python

  • matplotlib

  • networkx

    You can install the above packages with pip install -r requirements.txt
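
For reference, a minimal setup might look like the following (a sketch only; the exact PyTorch/torchvision versions and install method should be adjusted to your CUDA setup):

# Example of environment setup (versions are illustrative)
pip install torch==0.4.0 torchvision==0.2.1
pip install -r requirements.txt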

Getting Started

Data Preparation

  • You need to download the human parsing datasets, prepare the images, and store them in data/datasets/dataset_name/. We recommend symlinking the dataset paths into data/datasets/ as follows:
# symlink the Pascal-Person-Part dataset for example
ln -s /path_to_Pascal_Person_Part/* data/datasets/pascal/
  • The file structure should look like:
/Graphonomy
  /data
    /datasets
      /pascal
        /JPEGImages
        /list
        /SegmentationPart
      /CIHP_4w
        /Images
        /lists
        ...  
  • The datasets (CIHP & ATR) are available on Google Drive and Baidu Drive. You also need to download the flipped labels: download cihp_flipped, unzip it, and store it in data/datasets/CIHP_4w/; download atr_flip, unzip it, and store it in data/datasets/ATR/.
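
For example, assuming the archives were downloaded into the repository root (archive file names are illustrative):

# Example of unpacking the flipped labels
unzip cihp_flipped.zip -d data/datasets/CIHP_4w/
unzip atr_flip.zip -d data/datasets/ATR/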

Inference

We provide a simple script for getting visualization results on the CIHP dataset using a trained model, as follows:

# Example of inference
python exp/inference/inference.py  \
--loadmodel /path_to_inference_model \
--img_path ./img/messi.jpg \
--output_path ./img/ \
--output_name /output_file_name
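
For instance, with a downloaded checkpoint stored under data/pretrained_model/ (the checkpoint file name and output name below are illustrative):

# Example of a concrete inference run
python exp/inference/inference.py \
--loadmodel ./data/pretrained_model/universal_trained.pth \
--img_path ./img/messi.jpg \
--output_path ./img/ \
--output_name messi_parsed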

Training

Transfer learning

  1. Download the Pascal pretrained model (available soon).
  2. Run sh train_transfer_cihp.sh.
  3. The results and models are saved in exp/transfer/run/.
  4. The evaluation and visualization script is eval_cihp.sh; you only need to change the --loadmodel argument before running it (see the sketch below).
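
A typical run might look like the following (the checkpoint path set via --loadmodel in eval_cihp.sh is illustrative; use the one written under exp/transfer/run/):

# Example of transfer learning on CIHP
sh train_transfer_cihp.sh
# evaluate and visualize (after editing --loadmodel in eval_cihp.sh)
sh eval_cihp.sh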

Universal training

  1. Download the pretrained model and store it in /data/pretrained_model/.
  2. Run sh train_universal.sh (see the sketch below).
  3. The results and models are saved in exp/universal/run/.
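
For example (assuming the pretrained model has already been placed in data/pretrained_model/):

# Example of universal training
sh train_universal.sh
# checkpoints and logs are written to exp/universal/run/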

Testing

If you want to evaluate the performance of a pre-trained model on the PASCAL-Person-Part or CIHP val/test set, simply run sh eval_pascal.sh or sh eval_cihp.sh and specify the model to load via --loadmodel. We also provide the final models, which you can download and store in /data/pretrained_model/.
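
For example, after pointing --loadmodel in the corresponding script at the downloaded model (e.g. a file placed in data/pretrained_model/):

# Example of evaluation
sh eval_cihp.sh     # CIHP val/test set
sh eval_pascal.sh   # PASCAL-Person-Part val/test set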

Models

Pascal-Person-Part trained model

Model            | Google Cloud | Baidu Yun
Graphonomy(CIHP) | Download     | Available soon

CIHP trained model

Model              | Google Cloud | Baidu Yun
Graphonomy(PASCAL) | Download     | Available soon

Universal trained model

Model     | Google Cloud | Baidu Yun
Universal | Download     | Available soon

Todo:

  • release pretrained and trained models
  • update universal eval code & script

Citation

@inproceedings{Gong2019Graphonomy,
  author    = {Ke Gong and Yiming Gao and Xiaodan Liang and Xiaohui Shen and Meng Wang and Liang Lin},
  title     = {Graphonomy: Universal Human Parsing via Graph Transfer Learning},
  booktitle = {CVPR},
  year      = {2019},
}

Contact

If you have any questions about this repo, please feel free to contact gaoym9@mail2.sysu.edu.cn.

Related work

  • Self-supervised Structure-sensitive Learning SSL
  • Joint Body Parsing & Pose Estimation Network JPPNet
  • Instance-level Human Parsing via Part Grouping Network PGN