Monocular Depth Prediction

This repository contains an unofficial PyTorch implementation of the monocular depth prediction model described in "Deeper Depth Prediction with Fully Convolutional Residual Networks" by Iro Laina et al. For the official models, see the FCRN-DepthPrediction repository. This implementation supports data pre-processing, training from scratch, and evaluation. The code currently supports only the NYU Depth v2 dataset, but adding other datasets should be straightforward.
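The core idea of Laina et al.'s architecture is a ResNet encoder followed by up-projection blocks that double the spatial resolution at each decoder stage. As a rough illustration only (the exact layers in dense_estimation may differ, and this sketch substitutes simple nearest-neighbor upsampling for the paper's unpooling), one such block could look like:

```python
import torch
import torch.nn as nn

class UpProjection(nn.Module):
    """Sketch of an up-projection block: 2x upsampling followed by a
    small convolutional branch with a residual (projection) shortcut."""
    def __init__(self, in_ch, out_ch):
        super().__init__()
        self.conv1 = nn.Conv2d(in_ch, out_ch, kernel_size=5, padding=2)
        self.conv2 = nn.Conv2d(out_ch, out_ch, kernel_size=3, padding=1)
        self.skip = nn.Conv2d(in_ch, out_ch, kernel_size=5, padding=2)
        self.relu = nn.ReLU(inplace=True)

    def forward(self, x):
        # Nearest-neighbor upsampling stands in for the paper's unpooling.
        x = nn.functional.interpolate(x, scale_factor=2, mode="nearest")
        main = self.conv2(self.relu(self.conv1(x)))
        return self.relu(main + self.skip(x))

block = UpProjection(64, 32)
out = block(torch.randn(1, 64, 15, 20))
print(out.shape)  # torch.Size([1, 32, 30, 40])
```

Stacking several of these blocks upsamples the low-resolution encoder features back toward the input resolution, where a final convolution predicts the depth map.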

Note that there is some code to support uncertainty (variance) prediction; however, some of its dependencies are missing from this repo and I didn't have time to document it. You can ignore this code: leave the --dist argument set to '' to use standard depth prediction.

TODO

  • upload evaluation performance numbers on NYU Depth v2
  • document the test.py script

License

This project is licensed under the MIT License (refer to the LICENSE file for details).

Setup (Python 3)

Install prerequisites

  • install PyTorch
  • install TensorFlow (needed only for TensorBoard visualization; no GPU support required). The easiest way is to run pip install tensorflow.
  • install the other Python packages: pip install scipy matplotlib h5py
  • install MATLAB (the pre-processing script depends on the NYU Depth v2 MATLAB toolbox)

Prepare datasets

  • python nyud_test_to_npy.py (modify the paths in that file to point to the correct dirs)
  • download the NYU Depth v2 raw dataset (~400 GB) and the toolbox from https://cs.nyu.edu/~silberman/datasets/nyu_depth_v2.html
  • generate the training dataset with MATLAB; see process_raw.m
  • python nyud_raw_train_to_npy.py (modify the paths in that file to point to the correct dirs)
  • modify raw_root in train.py and test.py to point to the correct dir
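After conversion, the samples live in plain .npy files, so they can be saved and inspected with NumPy alone. A minimal sketch of that pattern (the array shapes and file names here are illustrative, not the exact layout the conversion scripts emit):

```python
import os
import tempfile

import numpy as np

# Stand-in for a batch of converted depth maps; real NYU Depth v2
# frames are 480x640, but the exact array layout is an assumption here.
depths = np.random.rand(4, 480, 640).astype(np.float32)

with tempfile.TemporaryDirectory() as d:
    path = os.path.join(d, "depths.npy")
    np.save(path, depths)
    # mmap_mode="r" lets a large training set be read lazily from disk
    # instead of being loaded into RAM all at once.
    loaded = np.load(path, mmap_mode="r")
    shape, dtype = loaded.shape, loaded.dtype

print(shape, dtype)  # (4, 480, 640) float32
```

Memory-mapping is worth knowing about here because the converted raw training set is large; slicing a memory-mapped array reads only the frames you touch.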

Usage examples

Train and view results

  • python train.py --ex my_test
  • tensorboard --logdir log/my_test
  • open http://localhost:6006 in a browser

Continue training from checkpoint

Checkpoints are stored after each epoch.

  • python train.py --ex my_test --epochs 80 --lr 0.01
  • python train.py --ex my_test --epochs 50 --lr 0.003
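The checkpoint/resume pattern behind the commands above is the standard PyTorch one; the actual checkpoint keys used by train.py may differ, but a minimal sketch looks like this:

```python
import os
import tempfile

import torch
import torch.nn as nn

model = nn.Linear(4, 1)
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)

with tempfile.TemporaryDirectory() as d:
    path = os.path.join(d, "checkpoint.pth")

    # Save after an epoch: both model and optimizer state are stored
    # so momentum and learning-rate state resume correctly.
    torch.save({"epoch": 42,
                "model": model.state_dict(),
                "optimizer": optimizer.state_dict()}, path)

    # Resume: rebuild the objects, then load the saved state into them.
    ckpt = torch.load(path)
    model.load_state_dict(ckpt["model"])
    optimizer.load_state_dict(ckpt["optimizer"])
    start_epoch = ckpt["epoch"] + 1

print(start_epoch)  # 43
```

Note that a new --lr value (as in the second command above) takes effect for the resumed run, which is how a learning-rate drop partway through training is expressed here.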

View all training options

  • python train.py --help