
EdgeDepth

This is the reference PyTorch implementation for training and testing depth estimation models using the method described in

The Edge of Depth: Explicit Constraints between Segmentation and Depth

Shengjie Zhu, Garrick Brazil and Xiaoming Liu

CVPR 2020

⚙️ Setup

  1. Compile the morphing operation:

    We use a customized morphing operation in our evaluation and training code. You can still train and evaluate without it, at some cost in performance. To enable it, do the following:

    1. Make sure your system's CUDA version matches the CUDA version your PyTorch build was compiled with.

    2. Type:

    cd bnmorph
    python setup.py install
    cd ..

    You should be able to compile it successfully if you can build the CUDA extension example in this PyTorch tutorial. A small version-check sketch is given after this list.

  2. Prepare KITTI data: We use the KITTI raw dataset as well as predicted semantic labels from this paper.

    1. To download the KITTI raw data:
    wget -i splits/kitti_archives_to_download.txt -P kitti_data/
    2. Use this link to download the precomputed semantic labels.
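
As a quick sanity check after running `setup.py install`, you can compare the CUDA version PyTorch was built with against your system toolkit and try importing the compiled extension. This is only a minimal sketch; the importable module name `bnmorph` is an assumption based on the directory name and may differ from what `setup.py` actually installs.

```python
import subprocess

import torch

# CUDA version that this PyTorch build was compiled against.
print("PyTorch built with CUDA:", torch.version.cuda)

# CUDA version of the system toolkit (nvcc); it should match the above.
print(subprocess.run(["nvcc", "--version"], capture_output=True, text=True).stdout)

try:
    # Importable name of the compiled morphing extension is an assumption.
    import bnmorph  # noqa: F401
    print("bnmorph extension imported successfully")
except ImportError as exc:
    print("bnmorph extension not available:", exc)
```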

⏳ Training

Training code will be released soon.

📊 Evaluation

  1. A pretrained model is available here.

  2. Precompute the ground-truth depth maps:

    python export_gt_depth.py --data_path [Your Kitti Raw Data Address] --split eigen
  3. To evaluate without morphing, use:

    python evaluate_depth.py --split eigen --dataset kitti --data_path [Your Kitti Raw Data Address] --load_weights_folder [Your Model Address] --eval_stereo \
     --num_layers 50 --post_process

    To evaluate with morphing, use:

    python evaluate_depth.py --split eigen --dataset kitti --data_path [Your Kitti Raw Data Address] --load_weights_folder [Your Model Address] --eval_stereo \
     --num_layers 50 --post_process --bnMorphLoss --load_semantics --seman_path [Your Predicted Semantic Label Address]
  4. You should get performance similar to the entry "Ours" in the table below (a sketch of how these two metrics are computed follows the table):

    | Method | Uses LiDAR ground truth? | Morphed? | KITTI abs. rel. error | delta < 1.25 |
    | --- | --- | --- | --- | --- |
    | BTS | Yes | No | 0.091 | 0.904 |
    | Depth Hints | No | No | 0.096 | 0.890 |
    | Ours | No | No | 0.091 | 0.898 |
    | Ours | No | Yes | 0.090 | 0.899 |
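
For reference, the two metric columns above are the standard KITTI depth-evaluation quantities: the absolute relative error and the fraction of pixels whose depth ratio to the ground truth is within 1.25. Below is a minimal sketch of how they can be computed from a predicted and a ground-truth depth map; the function name and the clipping range (1e-3 to 80 m, the usual eigen-split convention) are assumptions rather than values taken from the evaluation script.

```python
import numpy as np


def depth_metrics(pred, gt, min_depth=1e-3, max_depth=80.0):
    """Return (abs. rel. error, delta < 1.25) for two depth maps in metres.

    `pred` and `gt` are same-shape numpy arrays; the depth clipping range
    follows the common KITTI eigen-split convention (an assumption here).
    """
    # Only evaluate pixels with valid ground truth.
    mask = (gt > min_depth) & (gt < max_depth)
    pred = np.clip(pred[mask], min_depth, max_depth)
    gt = gt[mask]

    # Absolute relative error.
    abs_rel = np.mean(np.abs(pred - gt) / gt)

    # Fraction of pixels whose max(pred/gt, gt/pred) is below 1.25.
    ratio = np.maximum(pred / gt, gt / pred)
    delta_125 = np.mean(ratio < 1.25)
    return abs_rel, delta_125
```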

🖼 Running on your own images

To run on your own images, run:

python test_simple.py --image_path <your_image_path> \
  --model_path <your_model_path> \
  --num_layers <18 or 50>

This will save the predicted depth as a numpy array (at the original resolution), along with colormapped depth and disparity images. A sketch for loading the saved array is below.
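
If you want to inspect the saved numpy array yourself, here is a minimal sketch. The file path used below is a placeholder assumption; check the output directory of test_simple.py for the actual file naming.

```python
import matplotlib.pyplot as plt
import numpy as np

# Path to the array saved by test_simple.py (placeholder; the exact
# file name produced by the script may differ).
depth = np.load("assets/test_image_depth.npy").squeeze()

print("depth map shape:", depth.shape, "range:", depth.min(), "-", depth.max())

# Visualize with a perceptually uniform colormap, as many
# monodepth-style repositories do.
plt.imshow(depth, cmap="magma")
plt.colorbar(label="depth")
plt.axis("off")
plt.savefig("depth_vis.png", bbox_inches="tight")
```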

Acknowledgment

A large part of our code base comes from Monodepth2 and Depth Hints.
