Would it be possible to replace gradient backpropagation (gradient calculated from an autograd function) with scalar loss backpropagation? #4

Open
Alisa-de opened this issue Oct 26, 2020 · 2 comments

@Alisa-de

Hello,
I am currently using your code for surface normal estimation. I was wondering: what is the benefit of computing the gradient (df) in the loss function and calling output.backward(gradient=df) in the training process, instead of computing a scalar loss value and calling loss.backward()?

Btw, have you ever tried letting the depth branch refine the raw depth?

Thank you for your help in advance!
Best regards
Alisa

@jzengust
Owner

Hi Alisa,
backward(df) is intended to zero out the gradient of pixels in the masked region.
I didn't really try to use the depth branch alone for raw depth refinement, but I believe it is also feasible to do so.
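
Concretely, output.backward(gradient=df) backpropagates the vector-Jacobian product with df, so wherever the mask zeroes df, those pixels contribute no gradient at all. A minimal sketch with placeholder tensors (not the actual training code):

```python
import torch

# Placeholder setup: a leaf tensor stands in for the network output.
output = torch.randn(1, 3, 4, 4, requires_grad=True)  # e.g. predicted normals
target = torch.randn(1, 3, 4, 4)
mask = (torch.rand(1, 1, 4, 4) > 0.3).float()         # 1 = valid pixel, 0 = masked

# Hand-computed gradient of an L2 loss w.r.t. the output, zeroed on masked pixels.
df = (2.0 * (output - target) * mask).detach()

# backward(gradient=df) backpropagates this tensor directly, so masked
# pixels contribute exactly zero gradient to everything upstream.
output.backward(gradient=df)
print((output.grad * (1 - mask)).abs().sum())  # tensor(0.)
```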

Best regards,
Jin

@Alisa-de
Author

Hi Jin,
many thanks for your reply!

If I understood correctly, does that mean this backpropagation process (with backward(df)) is different from simply taking the scalar loss from the loss function and calling loss.backward() in the training code? That is, loss.backward() would not give the masked region zero gradient.
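
For what it's worth, a scalar loss summed over all pixels would indeed let masked pixels push gradient into the network, but folding the mask into the loss recovers the same zeroing. A minimal check with placeholder tensors (not the repository's actual loss):

```python
import torch

# Placeholder tensors standing in for a prediction, its target, and a validity mask.
output = torch.randn(2, 3, 4, 4, requires_grad=True)
target = torch.randn(2, 3, 4, 4)
mask = (torch.rand(2, 1, 4, 4) > 0.3).float()  # 1 = valid pixel, 0 = masked

# Scalar-loss route: apply the mask inside the loss, then call loss.backward().
loss = ((output - target) ** 2 * mask).sum()
loss.backward()

# Masked pixels receive exactly zero gradient, matching
# output.backward(gradient=2 * (output - target) * mask).
print((output.grad * (1 - mask)).abs().sum())  # tensor(0.)
```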

Sorry, I meant jointly doing surface normal estimation and depth refinement (or depth estimation). Given the relationship between depth and surface normals, the two tasks might be able to help each other during training. However, when combining these two tasks I ran into another issue. I would like to use output.backward(df) for backpropagation, so may I ask: in your opinion, would it make a difference whether I call output_d.backward(df) first or output_sn.backward(df) first?
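
For concreteness, here is the two-call pattern as a toy sketch with placeholder tensors (not the actual model). Since gradients from successive backward() calls accumulate additively in .grad, the order of the two calls should not change the accumulated result; note the first call needs retain_graph=True when the branches share part of the graph:

```python
import torch

# Toy two-branch graph: a shared intermediate h feeds both heads, mimicking
# a shared encoder with a depth branch and a surface-normal branch.
w = torch.randn(4, requires_grad=True)
h = w * 2.0            # shared part of the graph
output_d = h + 1.0     # stand-in for the depth output
output_sn = h * 3.0    # stand-in for the surface-normal output
df_d = torch.ones(4)   # stand-in gradients from the two loss functions
df_sn = torch.ones(4)

# Gradients accumulate additively in w.grad, so the final gradient is the
# same whichever backward() call runs first; retain_graph=True keeps the
# shared graph alive for the second call.
output_d.backward(gradient=df_d, retain_graph=True)
output_sn.backward(gradient=df_sn)
print(w.grad)  # tensor([8., 8., 8., 8.]) in either order
```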

Best regards,
Alisa
