
2D human pose estimation with DSNT

This repository contains code for training and evaluating a ResNet or Stacked Hourglass model on the MPII Human Pose dataset using the differentiable spatial-to-numerical transform (DSNT).

If you want to use DSNT in your own project, check out dsntnn instead. dsntnn is a small, self-contained library containing all of the operations required for DSNT, the loss function, and regularization terms.
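As background, the DSNT layer itself is just a differentiable expectation over a normalized heatmap: each output coordinate is the heatmap-weighted average of a fixed coordinate grid. The following is a minimal NumPy sketch, not the dsntnn implementation; the exact coordinate normalization (pixel centres spanning (-1, 1)) is an assumption here.

```python
import numpy as np

def dsnt(heatmap):
    """Sketch of the differentiable spatial-to-numerical transform.

    heatmap: 2D array of non-negative values summing to 1.
    Returns (x, y), the expected coordinate under the heatmap,
    with pixel centres normalized to lie in (-1, 1) on each axis.
    """
    h, w = heatmap.shape
    # Centre of pixel i out of n sits at (2*i - (n - 1)) / n.
    xs = (2 * np.arange(w) - (w - 1)) / w
    ys = (2 * np.arange(h) - (h - 1)) / h
    x = float(np.sum(heatmap * xs[np.newaxis, :]))
    y = float(np.sum(heatmap * ys[:, np.newaxis]))
    return x, y

# A heatmap with all mass at row 1, column 2 of a 4x4 grid maps to
# that pixel's normalized centre.
hm = np.zeros((4, 4))
hm[1, 2] = 1.0
x, y = dsnt(hm)  # → (0.25, -0.25)
```

Because the output is a sum of products, gradients flow back to every heatmap value, which is what makes the transform trainable end to end.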



Setup

Edit docker-compose.yml to set the desired location for the MPII Human Pose dataset on your computer.

Next, download and install the MPII Human Pose dataset:

$ ./ python
>>> from torchdata import mpii
>>> mpii.install_mpii_dataset('/datasets/mpii')

Running scripts


Run the test suite:

$ ./ pytest


To train a model:

  1. [Optional] Start the Showoff server. Showoff is a visualisation server that can display metrics while the model trains.
    $ docker-compose up -d showoff
  2. Run the training script (pass --showoff="" if not using Showoff).
    $ ./ src/dsnt/bin/ --epochs=100
  3. Wait for training to finish. If using Showoff, you can monitor progress at http://localhost:16676.


Generating predictions

bin/ may be used to generate predictions from trained models on the MPII dataset. The predictions can be written to HDF5 files compatible with eval-mpii-pose, which is especially useful for producing Matlab submission files for the official MPII evaluation code.
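For illustration, writing predictions to HDF5 with h5py looks roughly like the sketch below. The dataset name "preds" and the N × 16 × 2 layout (N images, 16 MPII joints, (x, y) per joint) are assumptions for this example, not details taken from this repository or from eval-mpii-pose.

```python
import os
import tempfile

import h5py
import numpy as np

# Hypothetical predictions: 5 images, 16 MPII joints, (x, y) per joint.
preds = np.random.rand(5, 16, 2)

# Write the predictions to an HDF5 file under a single dataset.
path = os.path.join(tempfile.mkdtemp(), 'preds.h5')
with h5py.File(path, 'w') as f:
    f.create_dataset('preds', data=preds)

# Read the file back to verify the round trip.
with h5py.File(path, 'r') as f:
    loaded = f['preds'][()]
```

HDF5 keeps the array shape and dtype intact, which is why it is a convenient interchange format between the Python training code and Matlab evaluation tooling.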

Other implementations

If you write an implementation of DSNT, please let me know so that I can add it to the list.

License and citation

(C) 2017-2018 Aiden Nibali

This project is open source under the terms of the Apache License 2.0.

If you use any part of this work in a research project, please cite the following paper:

@article{nibali2018numerical,
  title={Numerical Coordinate Regression with Convolutional Neural Networks},
  author={Nibali, Aiden and He, Zhen and Morgan, Stuart and Prendergast, Luke},
  journal={arXiv preprint arXiv:1801.07372},
  year={2018}
}