MegaDepth: Learning Single-View Depth Prediction from Internet Photos

This is the code for the algorithm described in "MegaDepth: Learning Single-View Depth Prediction from Internet Photos", Z. Li and N. Snavely, CVPR 2018. The code skeleton is based on https://github.com/junyanz/pytorch-CycleGAN-and-pix2pix. If you use our code or models for academic purposes, please consider citing:

@inproceedings{MDLi18,
  title={MegaDepth: Learning Single-View Depth Prediction from Internet Photos},
  author={Zhengqi Li and Noah Snavely},
  booktitle={Computer Vision and Pattern Recognition (CVPR)},
  year={2018}
}

Examples of single-view depth predictions on photos randomly downloaded from the Internet:

Dependencies:

  • The code was written for PyTorch 0.2 and Python 2.7, but it should be straightforward to adapt to Python 3 and recent PyTorch versions if needed.
  • You will also need the skimage and h5py Python libraries installed before running the code (a quick import check is sketched after this list).
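
As a quick, hedged sanity check of the environment (assuming the usual PyPI packages torch, scikit-image, and h5py), you can verify that the imports work:

    # Hedged environment check; exact version requirements may differ on your setup.
    import torch
    import skimage
    import h5py

    print("torch:", torch.__version__)
    print("skimage:", skimage.__version__)
    print("h5py:", h5py.__version__)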

Single-view depth prediction on any Internet photo:

    python demo.py

You should see an inverse depth prediction saved as demo.png, computed from the original photo demo.jpg. If you want RGB visualizations like the figures in our paper, you also have to install and run the semantic segmentation model from https://github.com/kazuto1011/pspnet-pytorch (trained on ADE20K) to mask out the sky, because the inconsistent depth predictions in unmasked sky regions make the RGB visualization unreasonable.
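
For reference, below is a condensed, hedged sketch of what this inference step looks like. The helper load_pretrained_hourglass() is hypothetical (it stands in for the model construction and checkpoint loading done in demo.py), and the 384x512 input size is an assumption; treat it as an illustration of the log-depth to inverse-depth conversion, not the exact script.

    # Hedged sketch of single-image inference; load_pretrained_hourglass() is a
    # hypothetical helper standing in for the model setup done in demo.py.
    import numpy as np
    import torch
    from skimage import io
    from skimage.transform import resize

    net = load_pretrained_hourglass()  # hypothetical: build the hourglass network and load weights
    net.eval()

    # Resize the input photo to a fixed resolution before inference (384x512 is an assumption).
    img = np.float32(io.imread('demo.jpg')) / 255.0
    img = resize(img, (384, 512), order=1)
    inp = torch.from_numpy(np.transpose(img, (2, 0, 1))).float().unsqueeze(0)

    with torch.no_grad():
        log_depth = net(inp).squeeze()       # the network predicts log depth
    inv_depth = 1.0 / torch.exp(log_depth)   # convert to inverse depth for display
    inv_depth = inv_depth / inv_depth.max()  # normalize to [0, 1]
    io.imsave('demo.png', (inv_depth.numpy() * 255).astype(np.uint8))

For the colormapped RGB visualizations in the paper, you would additionally zero out the sky pixels using the PSPNet segmentation mentioned above before applying a colormap.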

Evaluation on the MegaDepth test splits:

  • To compute the RMS error on the test splits, run (a sketch of the underlying metric follows this list):

    python rmse_error_main.py

  • To compute the Structure from Motion Disagreement Rate (SDR), change the variable "dataset_root" in "rmse_error_main.py" to the root directory of the MegaDepth_v1 folder, change the variables "test_list_dir_l" and "test_list_dir_p" to the corresponding test-list folder paths, and run:

    python SDR_compute.py
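
For context, a hedged sketch of a per-image scale-invariant RMSE in log-depth space (the kind of metric reported in the paper) is given below; it is illustrative only, since rmse_error_main.py also handles loading the MegaDepth_v1 data, the test lists, and averaging over the split.

    # Hedged sketch of scale-invariant RMSE in log-depth space; not the exact
    # evaluation script, which also handles data loading and aggregation.
    import numpy as np

    def scale_invariant_rmse(pred_depth, gt_depth, valid_mask):
        """pred_depth, gt_depth: HxW depth maps; valid_mask: boolean HxW of pixels with ground truth."""
        d = np.log(pred_depth[valid_mask]) - np.log(gt_depth[valid_mask])
        return np.sqrt(np.mean(d ** 2) - np.mean(d) ** 2)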
