Forked from isl-org/MiDaS

Code for robust monocular depth estimation described in "Lasinger et al., Towards Robust Monocular Depth Estimation: Mixing Datasets for Zero-Shot Cross-Dataset Transfer, arXiv:1907.01341"


Towards Robust Monocular Depth Estimation: Mixing Datasets for Zero-Shot Cross-Dataset Transfer

This repository contains code to compute depth from a single image. It accompanies our paper:

Towards Robust Monocular Depth Estimation: Mixing Datasets for Zero-Shot Cross-Dataset Transfer
Katrin Lasinger, René Ranftl, Konrad Schindler, Vladlen Koltun

The pre-trained model corresponds to RW+MD+MV with MGDA enabled and movies sampled at 4 frames per second.

Setup

  1. Download the model weights model.pt and place the file in the root folder.

  2. Set up dependencies:

    conda install pytorch torchvision opencv

    The code was tested with Python 3.7, PyTorch 1.0.1, and OpenCV 3.4.2.
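A quick way to confirm that the dependencies above resolved is to probe for the modules without fully loading them. This is a minimal sketch (the helper name is illustrative; note that the opencv package installs under the import name cv2):

```python
import importlib.util

def missing_dependencies(names):
    """Return the module names from `names` that are not importable."""
    return [n for n in names if importlib.util.find_spec(n) is None]

# torch and torchvision come from the pytorch/torchvision packages;
# opencv is imported as cv2.
print(missing_dependencies(["torch", "torchvision", "cv2"]))
```

An empty list means the environment is ready; any names printed must still be installed.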

Usage

  1. Place one or more input images in the folder input.

  2. Run the model:

    python run.py

  3. The resulting depth maps are written to the output folder.
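The model predicts relative depth, so a common final step before writing an image to disk is to rescale the raw prediction into an integer range. A minimal sketch of such a rescaling, assuming the prediction arrives as a NumPy array (the function name and the 16-bit choice are illustrative, not necessarily what run.py does):

```python
import numpy as np

def normalize_depth(prediction, bits=16):
    """Scale a raw relative-depth prediction to the full integer range
    so it can be saved as an 8- or 16-bit grayscale image."""
    max_val = 2 ** bits - 1
    d_min, d_max = prediction.min(), prediction.max()
    if d_max - d_min > np.finfo(np.float64).eps:
        out = max_val * (prediction - d_min) / (d_max - d_min)
    else:
        # Constant prediction: avoid division by zero, emit all zeros.
        out = np.zeros_like(prediction)
    return out.astype(np.uint16 if bits == 16 else np.uint8)
```

The guard against a near-zero range matters in practice: a degenerate (constant) prediction would otherwise divide by zero.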

Citation

Please cite our paper if you use this code in your research:

@article{Lasinger2019,
	author  = {Katrin Lasinger and Ren\'{e} Ranftl and Konrad Schindler and Vladlen Koltun},
	title   = {Towards Robust Monocular Depth Estimation: Mixing Datasets for Zero-Shot Cross-Dataset Transfer},
	journal = {arXiv:1907.01341},
	year    = {2019},
}

License

MIT License
