fatemehkarimii/LightDepth

  • A simple TensorFlow/PyTorch implementation of the paper "LightDepth: A Resource Efficient Depth Estimation Approach for Dealing with Ground Truth Sparsity via Curriculum Learning". The paper can be read here.

Results

  • Visual comparison demonstrating the improvement of our output over DenseNet on the KITTI dataset. The left column shows the input images and the right column shows the sparse ground-truth depth maps.
  • Performance comparison of state-of-the-art depth estimation models on the KITTI Eigen split. Evaluation was performed on a Raspberry Pi 4, considering trainable parameters (Params, in millions), GFLOPs, runtime (in seconds), and battery consumption (in Watt-seconds, WS). The best results are highlighted in bold.
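For reference, the battery figures combine the other two measurements: energy in Watt-seconds is average power draw (Watts) multiplied by runtime (seconds). A minimal sketch of that arithmetic (the function name and the example values are illustrative, not taken from the repository):

```python
def battery_watt_seconds(avg_power_watts: float, runtime_seconds: float) -> float:
    """Energy consumed (Ws) = average power (W) x runtime (s)."""
    return avg_power_watts * runtime_seconds

# e.g. an inference pass that takes 0.5 s while the device draws 4.0 W costs 2.0 Ws
print(battery_watt_seconds(4.0, 0.5))  # → 2.0
```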

Data

  • KITTI: copy the raw data to a folder with the path '../kitti'. Our method expects dense input depth maps; therefore, you need to run a depth inpainting method on the LiDAR data.
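The README does not prescribe a specific inpainting method. As a crude stand-in, one common family of approaches propagates valid LiDAR depths into the zero-valued holes by repeated local dilation; a minimal NumPy sketch of that idea (the function is an assumption for illustration, not the repository's code) might look like this:

```python
import numpy as np

def densify_depth(sparse: np.ndarray, max_iters: int = 100) -> np.ndarray:
    """Fill zero-valued (missing) pixels of a sparse depth map by repeatedly
    replacing each hole with the maximum depth in its 3x3 neighborhood.
    A crude stand-in for a proper depth-inpainting method."""
    h, w = sparse.shape
    depth = sparse.astype(np.float64).copy()
    for _ in range(max_iters):
        if (depth > 0).all():
            break  # map is fully dense
        padded = np.pad(depth, 1, mode="edge")
        # Maximum over the 3x3 neighborhood of every pixel.
        neighborhood_max = np.stack(
            [padded[i:i + h, j:j + w] for i in range(3) for j in range(3)]
        ).max(axis=0)
        # Keep valid measurements; fill holes from their neighbors.
        depth = np.where(depth > 0, depth, neighborhood_max)
    return depth

# A single LiDAR return at the center spreads to the whole 5x5 map.
sparse = np.zeros((5, 5))
sparse[2, 2] = 10.0
dense = densify_depth(sparse)
```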

Training & Evaluation

  • Simply run python main.py.

Pretrained model

  • The pretrained model can be accessed here.
