The training results differ from the paper by 33% #36

Open
bizheyandebing opened this issue Apr 9, 2024 · 3 comments

Comments

@bizheyandebing

I have fully trained your model, and the results differ from the model you provided by close to 33%. Could you please help me check my training file?

[attached screenshots]
Below is the content of my training file:

--data_path /home/ubuntu/ubuntu_jixie/temp/kitti-raw
--dataset kitti
--model_name res_088
--backbone resnet_lite
--height 192
--width 640
--batch_size 16
--num_epochs 25
--scheduler_step_size 15
--num_layers 50
--num_features 256
--model_dim 32
--patch_size 16
--dim_out 64
--query_nums 64
--min_depth 0.001
--max_depth 80.0
--eval_mono
--post_process
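
For reference, a minimal sketch of how the flags above would be combined into a single training invocation, assuming train.py is the entry point and accepts these options directly (the entry-point name is an assumption, not confirmed in this thread):

# Hypothetical invocation assembling the flags listed above;
# train.py as the entry point is an assumption.
python train.py \
  --data_path /home/ubuntu/ubuntu_jixie/temp/kitti-raw \
  --dataset kitti \
  --model_name res_088 \
  --backbone resnet_lite \
  --height 192 --width 640 \
  --batch_size 16 \
  --num_epochs 25 --scheduler_step_size 15 \
  --num_layers 50 \
  --num_features 256 --model_dim 32 \
  --patch_size 16 --dim_out 64 --query_nums 64 \
  --min_depth 0.001 --max_depth 80.0 \
  --eval_mono --post_process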

@bizheyandebing
Author

Yes, I use ResNet-50 as the backbone network. If I set --load_weights_folder, how should its value be set?
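
For illustration only, a minimal sketch of how that flag is typically populated in monodepth2-style codebases, where it points at the directory holding the saved .pth weight files; the path below is hypothetical and the exact convention for this repository is an assumption:

# Hypothetical: point the flag at a folder containing the saved model weights
# (e.g. encoder.pth / depth.pth in monodepth2-style training output).
--load_weights_folder /path/to/log_dir/models/weights_24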

@bizheyandebing
Author

I think this setting should have no effect, because I found that the relevant code in train.py (lines 137-138) has been commented out.

Moreover, there is no mention of additional pre-training in the paper.

@hisfog
Owner

hisfog commented Apr 10, 2024

Apologies for the delayed response. To reproduce the results on KITTI, please DO NOT use the latest code release. Instead, you can use the following version:

git checkout 6a1e997f97caef8de080bb2873f71cfbad9a8047

which is consistent with the implementation in the SQLdepth paper, without any additional modifications.
You can refer to #26 (comment). Thanks!
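
As a minimal sketch, pinning an existing clone to that commit would look like the following (the final git log is only a sanity check that the checkout landed on the intended revision):

# From inside an existing clone of the repository:
git fetch --all
git checkout 6a1e997f97caef8de080bb2873f71cfbad9a8047
git log -1 --oneline   # should report commit 6a1e997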
