Pytorch implementation of Pose Partition Networks for Multi-Person Pose Estimation (ECCV'18)

Pose Partition Networks for Multi-Person Pose Estimation

This repository contains the code and pretrained models of

Pose Partition Networks for Multi-Person Pose Estimation [PDF]
Xuecheng Nie, Jiashi Feng, Junliang Xing, and Shuicheng Yan
European Conference on Computer Vision (ECCV), 2018

Requirements

  • Python 3.5
  • Pytorch 0.2.0
  • OpenCV 3.0 or higher

Installation

  1. Install Pytorch: Please follow the official instruction on installation of Pytorch.
  2. Clone the repository
    git clone --recursive
  3. Download the MPII Multi-Person Human Pose dataset and create a symbolic link to its images directory
    ln -s PATH_TO_MPII_IMAGES_DIR dataset/mpi/images

Training

Run the following command to train PPN from scratch with an 8-stack Hourglass network as the backbone:




A simple way to record the training log is to append the following to the training command:

2>&1 | tee exps/logs/ppn.log

A script is provided to monitor validation accuracy during the training process:

python utils/
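
The monitoring script's file name is truncated above; as a rough sketch of what such a monitor might do (the log-line format and regex here are assumptions, not the repo's actual format):

```python
import re

# Hypothetical log line format, e.g. "Epoch [2] Val mAP: 70.5".
# The actual format written by the utils/ script is not shown in this
# README, so the regex below is an assumption.
VAL_RE = re.compile(r"Val mAP:\s*([0-9.]+)")

def best_val_accuracy(log_lines):
    """Return the best validation mAP found in the given log lines."""
    best = None
    for line in log_lines:
        m = VAL_RE.search(line)
        if m:
            acc = float(m.group(1))
            if best is None or acc > best:
                best = acc
    return best

sample = [
    "Epoch [1] Val mAP: 61.2",
    "Epoch [2] Val mAP: 70.5",
    "Epoch [3] Val mAP: 68.9",
]
print(best_val_accuracy(sample))  # -> 70.5
```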

Some configurable parameters in the training phase:

  • -b mini-batch size
  • --lr initial learning rate
  • --epochs total number of epochs for training
  • --snapshot-fname-prefix prefix of the snapshot file name, e.g. with '--snapshot-fname-prefix exps/snapshots/ppn', 'ppn.pth.tar' (latest model) and 'ppn_best.pth.tar' (model with the best validation accuracy) will be written to the folder 'exps/snapshots'
  • --resume path to the model for recovering training
  • -j number of workers for loading data
  • --print-freq print frequency
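
The snapshot naming convention described for --snapshot-fname-prefix can be illustrated with a small stdlib-only sketch (the repo presumably uses torch.save internally; this only mirrors the file-naming behavior):

```python
import pickle
import shutil
from pathlib import Path

def save_snapshot(state, prefix, is_best):
    """Mirror the --snapshot-fname-prefix convention: always write
    <prefix>.pth.tar (latest model) and, when validation accuracy
    improves, copy it to <prefix>_best.pth.tar."""
    latest = Path(f"{prefix}.pth.tar")
    latest.parent.mkdir(parents=True, exist_ok=True)
    # The real code would use torch.save here; pickle stands in for it.
    with open(latest, "wb") as f:
        pickle.dump(state, f)
    if is_best:
        shutil.copyfile(latest, f"{prefix}_best.pth.tar")
```

With prefix 'exps/snapshots/ppn', this produces exactly the two files described above.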

*The training log ppn.log is uploaded to the folder exps/logs for reference.

Evaluation

Run the following command to evaluate PPN on MPII validation set:



CUDA_VISIBLE_DEVICES=0 python --evaluate True --calc-map True --resume exps/snapshots/ppn_best.pth.tar

Run the following command to evaluate PPN on MPII testing set:

CUDA_VISIBLE_DEVICES=0 python --evaluate True --resume exps/snapshots/ppn_best.pth.tar --eval-anno dataset/mpi/jsons/MPI_MP_TEST_annotations.json

In particular, results will be saved as a .mat file following the official evaluation format of MPII Multi-Person Human Pose. An example is provided in exps/preds/mat_results/pred_keypoints_mpii_multi.mat.
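
As a rough illustration of how per-image predictions might be packed for such a .mat file (the variable name and layout here are assumptions; the authoritative reference is the example file above), a MATLAB cell array corresponds to a 1-D object-dtype NumPy array passed to scipy.io.savemat:

```python
import numpy as np

def build_pred_dict(keypoints_per_image):
    """Pack per-image predictions into a dict that scipy.io.savemat would
    write as a MATLAB cell array (a 1-D object-dtype array).

    keypoints_per_image: list of (num_people, 16, 2) arrays, one entry
    per image (MPII annotates 16 joints per person). The variable name
    'pred_keypoints' is an assumption; check the example .mat file in
    exps/preds/mat_results for the exact layout."""
    cell = np.empty(len(keypoints_per_image), dtype=object)
    for i, kp in enumerate(keypoints_per_image):
        cell[i] = np.asarray(kp, dtype=np.float64)
    return {"pred_keypoints": cell}

# Writing the file would then be a single call, e.g.:
# scipy.io.savemat("pred_keypoints_mpii_multi.mat", build_pred_dict(preds))
```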

Some configurable parameters in the testing phase:

  • --evaluate True for testing and False for training
  • --resume path to the model for evaluation
  • --calc-map calculate mAP or not
  • --pred-path path to the mat file for saving the evaluation results
  • --visualization visualize evaluation or not
  • --vis-dir directory for saving the visualization results
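
Taken together, the flags above suggest an argument parser roughly like the following sketch (the defaults and the boolean-parsing helper are assumptions, not the repo's actual definitions):

```python
import argparse

def str2bool(v):
    """Parse 'True'/'False' command-line strings. argparse's type=bool is
    a common pitfall, since bool('False') is True in Python."""
    return str(v).lower() in ("true", "1", "yes")

def build_eval_parser():
    # Hypothetical defaults; only the flag names come from the README.
    p = argparse.ArgumentParser(description="PPN evaluation flags (sketch)")
    p.add_argument("--evaluate", type=str2bool, default=False,
                   help="True for testing, False for training")
    p.add_argument("--calc-map", type=str2bool, default=False,
                   help="calculate mAP or not")
    p.add_argument("--resume", type=str, default="",
                   help="path to the model for evaluation")
    p.add_argument("--pred-path", type=str,
                   default="exps/preds/mat_results/pred_keypoints_mpii_multi.mat",
                   help="path to the mat file for saving results")
    p.add_argument("--visualization", type=str2bool, default=False,
                   help="visualize evaluation or not")
    p.add_argument("--vis-dir", type=str, default="exps/vis",
                   help="directory for visualization results")
    return p

args = build_eval_parser().parse_args(["--evaluate", "True", "--calc-map", "True"])
print(args.evaluate, args.calc_map)  # -> True True
```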

The pretrained model and its performance (measured by mAP) on the MPII validation set with this code:

Method Head Shoulder Elbow Wrist Hip Knee Ankle Avg. Pretrained Model
PPN 94.0 90.9 81.2 74.1 77.1 73.4 67.5 79.7 GoogleDrive
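
The Avg. column is simply the mean of the seven per-joint scores, which is easy to verify:

```python
# Per-joint mAP for PPN on the MPII validation set (from the table above).
scores = {
    "Head": 94.0, "Shoulder": 90.9, "Elbow": 81.2, "Wrist": 74.1,
    "Hip": 77.1, "Knee": 73.4, "Ankle": 67.5,
}
avg = sum(scores.values()) / len(scores)
print(round(avg, 1))  # -> 79.7
```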

*The single-person pose estimation model used to refine the multi-person pose estimation results will be released soon.

Citation

If you use our code/model in your work or find it helpful, please cite the paper:

@inproceedings{nie2018ppn,
  title={Pose Partition Networks for Multi-Person Pose Estimation},
  author={Nie, Xuecheng and Feng, Jiashi and Xing, Junliang and Yan, Shuicheng},
  booktitle={European Conference on Computer Vision (ECCV)},
  year={2018}
}