Typo in the Eq 5? #2
Comments
You are right.
Thanks for the code. I like the idea :) Now that I'm looking at the code, there is no beta implemented in it. I was wondering if you removed beta from this released code. I'm trying to reproduce the result you submitted to the benchmark, so please let me know if something is missing for that. If you don't mind, here are some inconsistencies I noticed, in case you want to edit your paper: I also think n3 should be passed through a separate sigmoid (not tanh), and then n1, n2 (after tanh) and n3 (after sigmoid) should be passed to the L2 normalization. Nice trick to avoid the sign flip of the normals, by the way :) You just need to update the figure, I think. Thanks,
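A minimal numeric sketch of the normal parameterization suggested above: n1 and n2 go through tanh, n3 goes through a separate sigmoid so the z-component stays positive (avoiding the sign ambiguity), and the triple is then L2-normalized. The function name `unit_normal` is hypothetical, not from the repository's code.

```python
import numpy as np

def unit_normal(r1, r2, r3):
    """Map three raw network outputs to a unit surface normal.

    Hypothetical illustration of the suggestion in this thread:
    n1, n2 use tanh (range [-1, 1]); n3 uses a sigmoid (range (0, 1)),
    which keeps the z-component positive; the triple is L2-normalized.
    """
    n1 = np.tanh(r1)
    n2 = np.tanh(r2)
    n3 = 1.0 / (1.0 + np.exp(-r3))   # sigmoid
    n = np.array([n1, n2, n3])
    return n / np.linalg.norm(n)

n = unit_normal(0.3, -1.2, 0.5)
print(np.linalg.norm(n))  # 1.0 up to floating point
print(n[2] > 0)           # z-component stays positive
```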
Hello, nice to hear from you again. =)
Closing this issue due to inactivity.
Hi, is the model submitted to the KITTI online benchmark trained on the official "annotated depth maps data set (14 GB)", which only contains 43,000 RGB images?
@luohongcheng Yes. The submitted model was fine-tuned using only the official ground truth depth maps, with the backbone network pretrained on ImageNet.
Would you please share the file lists you selected from the official training set? Thanks.
Dear Author,
Thank you for sharing your codes.
I have a question about equation 5:
Did you mean h = beta * log(y) - beta * log(y*)?
The way it is described in the paper, the betas are going to cancel each other out and have no effect!
Please correct me if I am wrong.
Thanks,
Ali
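A quick numeric check of the cancellation discussed in this question. The exact placement of beta in Eq. 5 is not reproduced in this thread, so both readings are sketched as hypothetical helpers: if beta multiplies each depth inside the log, it cancels exactly; if it multiplies each log term outside, it survives only as an overall scale on h.

```python
import math

def h_inside(y, y_star, beta):
    # beta applied inside each log: log(beta*y) - log(beta*y_star)
    # log(beta) appears in both terms and cancels exactly
    return math.log(beta * y) - math.log(beta * y_star)

def h_outside(y, y_star, beta):
    # beta applied outside each log: beta*log(y) - beta*log(y_star)
    # beta factors out as a common scale on the log difference
    return beta * math.log(y) - beta * math.log(y_star)

y, y_star = 4.0, 2.0
print(h_inside(y, y_star, 10.0) - h_inside(y, y_star, 1.0))   # ~0: beta cancels
print(h_outside(y, y_star, 10.0) / h_outside(y, y_star, 1.0)) # ~10: beta scales h
```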