# Pose2Seg

Official code for the paper "Pose2Seg: Detection Free Human Instance Segmentation" [ProjectPage] [arXiv] @ CVPR2019.

The OCHuman dataset proposed in our paper is released here.

*Pipeline of our pose-based instance segmentation framework.*

## Setup environment

```bash
pip install cython matplotlib tqdm opencv-python scipy pyyaml numpy
pip install torchvision torch
```

Then build and install the COCO API (the path below assumes your cocoapi clone lives under `~/github-public/`; adjust it to wherever your clone is):

```bash
cd ~/github-public/cocoapi/PythonAPI/
python setup.py build_ext install
cd -
```
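
A quick way to confirm the COCO API built correctly (a minimal sanity check):

```python
# Sanity check: these imports only succeed once pycocotools is built and installed.
from pycocotools.coco import COCO
from pycocotools import mask as mask_utils

print("pycocotools is available")
```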

## Download data

Note: `person_keypoints_(train/val)2017_pose2seg.json` is a subset of `person_keypoints_(train/val)2017.json` (from the COCO2017 Train/Val annotations). We keep only those instances that have both keypoint and segmentation annotations for our experiments.
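
For reference, a subset like this can be produced from the full COCO keypoint annotations roughly as follows (a minimal sketch, not the exact script used for the paper; paths are placeholders):

```python
import json

from pycocotools.coco import COCO

# Load the full COCO 2017 keypoint annotations (placeholder path).
coco = COCO("data/coco2017/annotations/person_keypoints_val2017.json")

# Keep only instances that carry both keypoint and segmentation annotations.
kept = [
    ann for ann in coco.dataset["annotations"]
    if ann.get("num_keypoints", 0) > 0 and ann.get("segmentation")
]

subset = dict(coco.dataset)  # shallow copy; reuses images/categories as-is
subset["annotations"] = kept
with open("person_keypoints_val2017_pose2seg.json", "w") as f:
    json.dump(subset, f)
```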

## Setup data

The `data` folder should be organized as follows:

```
data
├── coco2017
│   ├── annotations
│   │   ├── person_keypoints_train2017_pose2seg.json
│   │   ├── person_keypoints_val2017_pose2seg.json
│   ├── train2017
│   │   ├── ####.jpg
│   ├── val2017
│   │   ├── ####.jpg
├── OCHuman
│   ├── annotations
│   │   ├── ochuman_coco_format_test_range_0.00_1.00.json
│   │   ├── ochuman_coco_format_val_range_0.00_1.00.json
│   ├── images
│   │   ├── ####.jpg
```
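
Once the files are in place, you can sanity-check the layout by loading one of the annotation files (paths as in the tree above):

```python
from pycocotools.coco import COCO

# Load the Pose2Seg subset of the COCO val annotations and report basic counts.
coco = COCO("data/coco2017/annotations/person_keypoints_val2017_pose2seg.json")
print(f"{len(coco.imgs)} images, {len(coco.anns)} annotations")
```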

## How to train

```bash
python train.py
```

Note: currently only single-GPU training is supported.
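
On a multi-GPU machine, you can pin the process to one device before CUDA is initialized (a small sketch; GPU index 0 is just an example):

```python
import os

# Must be set before torch initializes CUDA, e.g. at the top of train.py.
os.environ["CUDA_VISIBLE_DEVICES"] = "0"

import torch
print(torch.cuda.device_count())  # should report 1
```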

## How to test

This script lets you test the model on (1) the COCOPersons val set and (2) the OCHuman val and test sets:

```bash
python test.py --weights last.pkl --coco --OCHuman
```

We retrained our model using this repo and obtained results similar to those in our paper. The final weights can be downloaded here.
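
`test.py` reports COCO-style metrics. For reference, standalone evaluation of COCO-format segmentation results looks roughly like this (a generic sketch using the COCO API, not the repo's own code; the results file name is hypothetical):

```python
from pycocotools.coco import COCO
from pycocotools.cocoeval import COCOeval

# Ground truth: OCHuman val annotations from the data layout above.
gt = COCO("data/OCHuman/annotations/ochuman_coco_format_val_range_0.00_1.00.json")
# Hypothetical detections file in the standard COCO results format.
dt = gt.loadRes("ochuman_val_segm_results.json")

coco_eval = COCOeval(gt, dt, iouType="segm")
coco_eval.evaluate()
coco_eval.accumulate()
coco_eval.summarize()
```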

## About Human Pose Templates in COCO

*Pose templates clustered using K-means on COCO.*

This repo already contains a template file, `modeling/templates.json`, which was used in our paper. You are free to explore different cluster parameters, as discussed in our paper; see `visualize_cluster.ipynb` for an example.
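
For intuition, the clustering behind such templates looks roughly like this (a minimal sketch with a simplified normalization and random placeholder data; see `cluster_pose.py` for the actual implementation):

```python
import numpy as np
from sklearn.cluster import KMeans

def normalize_pose(kpts):
    """Center and scale one (17, 2) array of COCO keypoint coordinates."""
    kpts = kpts - kpts.mean(axis=0, keepdims=True)
    return kpts / (np.linalg.norm(kpts, axis=1).max() + 1e-8)

# poses: (N, 17, 2) keypoints gathered from COCO annotations (random here).
poses = np.random.rand(1000, 17, 2)
flat = np.stack([normalize_pose(p) for p in poses]).reshape(len(poses), -1)

# Cluster the normalized poses; the number of templates is a free parameter.
kmeans = KMeans(n_clusters=3, n_init=10, random_state=0).fit(flat)
templates = kmeans.cluster_centers_.reshape(-1, 17, 2)
print(templates.shape)  # (3, 17, 2)
```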
