PyTorch implementation of our ICRA 2024 paper:
GAM-Depth: Self-Supervised Indoor Depth Estimation Leveraging a Gradient-Aware Mask and Semantic Constraints
Anqi Cheng, Zhiyuan Yang, Haiyue Zhu, Kezhi Mao
Step 1: Create a virtual environment
conda create -n gam_depth python=3.6
conda activate gam_depth
conda install pytorch==1.10.1 torchvision==0.11.2 torchaudio==0.10.1 cudatoolkit=11.3 -c pytorch -c conda-forge
Step 2: Download the modified scikit-image package following StructDepth
unzip scikit-image-0.17.2.zip
cd scikit-image-0.17.2
python setup.py build_ext -i
pip install -e .
Step 3: Install other packages
pip install -r requirements.txt
Please download the pretrained models and unzip them to MODEL_PATH
python inference_single_image.py --image_path=/home/image_path --load_weights_folder=MODEL_PATH
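To inspect a prediction, a small helper like the one below can min-max normalize a depth map to an 8-bit image for visualization. This is our own sketch: the function name is hypothetical, and it assumes you can obtain the predicted depth as a NumPy array from the inference script.

```python
import numpy as np

def depth_to_uint8(depth):
    """Min-max normalize a depth map and quantize it to 8-bit for display.

    Hypothetical helper; adapt the input to whatever format
    inference_single_image.py actually produces.
    """
    d = depth.astype(np.float64)
    # Guard against a constant depth map (zero dynamic range).
    d = (d - d.min()) / max(d.max() - d.min(), 1e-8)
    return (d * 255).astype(np.uint8)

# Toy 2x2 depth map in meters.
depth = np.array([[0.5, 1.0], [2.0, 4.5]])
print(depth_to_uint8(depth))
```

The resulting array can be saved as a grayscale PNG or passed through a colormap for nicer visualizations.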
Please download the test sets of NYUv2, ScanNet, and InteriorNet and unzip them to VAL_PATH
Modify the evaluation script eval.sh to evaluate depth on NYUv2, ScanNet, or InteriorNet separately
python evaluation/nyuv2_eval_depth.py \
--data_path VAL_PATH \
--load_weights_folder MODEL_PATH
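For reference, indoor depth evaluation scripts typically report a standard set of error and accuracy metrics. Below is a self-contained sketch of three of them (absolute relative error, RMSE, and the delta < 1.25 accuracy); the function name is ours, not the repo's API.

```python
import numpy as np

def compute_depth_metrics(gt, pred):
    """Standard depth metrics over valid (masked) ground-truth pixels."""
    # Accuracy: fraction of pixels whose ratio to ground truth is below 1.25.
    ratio = np.maximum(gt / pred, pred / gt)
    a1 = float((ratio < 1.25).mean())
    # Absolute relative error, normalized by ground-truth depth.
    abs_rel = float(np.mean(np.abs(gt - pred) / gt))
    # Root mean squared error in metric units.
    rmse = float(np.sqrt(np.mean((gt - pred) ** 2)))
    return {"abs_rel": abs_rel, "rmse": rmse, "a1": a1}

gt = np.array([1.0, 2.0, 4.0])
pred = np.array([1.0, 2.2, 3.6])
print(compute_depth_metrics(gt, pred))
```

In practice, ground-truth depth maps are first masked to valid pixels (and on NYUv2 usually center-cropped) before these statistics are computed.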
The raw NYUv2 dataset is about 400 GB and contains 590 videos. You can download the raw dataset from there
Please download the main directions with a random flip and the proxy semantic labels from there, then unzip them to VPS_PATH and SEG_PATH, respectively
Proxy semantic labels are generated from Light-Weight RefineNet
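The idea behind the semantic constraint can be illustrated (our simplified sketch, not the paper's exact loss) as penalizing depth differences between neighboring pixels that share a proxy semantic label, which encourages piecewise-smooth depth within object regions:

```python
import numpy as np

def semantic_smoothness(depth, labels):
    """Mean |depth difference| over horizontally adjacent same-label pixels.

    Illustrative only: a real loss would also cover vertical neighbors and
    weight the term against the photometric objective.
    """
    same = labels[:, 1:] == labels[:, :-1]   # pairs inside one semantic region
    diff = np.abs(depth[:, 1:] - depth[:, :-1])
    return float(diff[same].mean()) if same.any() else 0.0

depth = np.array([[1.0, 2.0, 5.0]])
labels = np.array([[0, 0, 1]])
# Only the same-label pair (columns 0 and 1) contributes.
print(semantic_smoothness(depth, labels))  # → 1.0
```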
Modify the training script train.sh to set the paths or change training settings.
python train.py \
--data_path DATA_PATH \
--val_path VAL_PATH \
--vps_path VPS_PATH \
--seg_path SEG_PATH \
--log_dir ../logs/ \
--model_name 1 \
--batch_size 4 \
--num_epochs 50 \
--start_epoch 0 \
--load_weights_folder MODEL_PATH/pretrain/ \
--using_GAM GAM \
--using_seg
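To give an intuition for the gradient-aware mask enabled by --using_GAM, the toy sketch below builds a binary mask from image-gradient magnitude, separating edge pixels from textureless regions. This is our own illustration of the general idea, not the exact GAM formulation from the paper.

```python
import numpy as np

def gradient_mask(img, thresh=0.1):
    """Binary mask that is 1 where the local image gradient exceeds thresh.

    Simplified forward differences stand in for a proper gradient operator
    (e.g. Sobel) purely for illustration.
    """
    gx = np.abs(np.diff(img, axis=1, prepend=img[:, :1]))  # horizontal gradient
    gy = np.abs(np.diff(img, axis=0, prepend=img[:1, :]))  # vertical gradient
    grad = np.maximum(gx, gy)
    return (grad > thresh).astype(np.float32)

img = np.zeros((4, 4), dtype=np.float32)
img[:, 2:] = 1.0          # a vertical step edge between columns 1 and 2
print(gradient_mask(img))  # ones only along the edge column
```

A loss could then treat the two regions differently, e.g. emphasizing photometric consistency at edges while relying on smoothness priors in textureless areas.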
We borrowed a lot of code from scikit-image, monodepth2, P2Net, StructDepth, Light-Weight RefineNet, and LEGO. Thanks for their excellent work!
@article{cheng2024gam,
title={GAM-Depth: Self-Supervised Indoor Depth Estimation Leveraging a Gradient-Aware Mask and Semantic Constraints},
author={Cheng, Anqi and Yang, Zhiyuan and Zhu, Haiyue and Mao, Kezhi},
journal={arXiv preprint arXiv:2402.14354},
year={2024}
}