Hello,
I am currently using your code for surface normal estimation. I was wondering: what is the benefit of computing the gradient (df) in the loss function and calling output.backward(gradient=df) during training, instead of computing a scalar loss value and calling loss.backward()?
By the way, have you ever tried letting the depth branch refine the raw depth?
Thank you for your help in advance!
Best regards
Alisa
Hi Alisa,
backward(df) is intended to zero out the gradient of pixels in the masked region.
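A minimal sketch of what this means in PyTorch, assuming a hypothetical toy prediction and mask (not the repo's actual tensors): passing a hand-computed gradient df that is zero in the masked region means those pixels contribute nothing to the parameter updates.

```python
import torch

# Hypothetical 2x2 "prediction", target, and validity mask (1 = valid, 0 = masked).
pred = torch.tensor([[1.0, 2.0], [3.0, 4.0]], requires_grad=True)
target = torch.zeros(2, 2)
mask = torch.tensor([[1.0, 0.0], [1.0, 1.0]])

# Hand-computed gradient of an L2-style loss, zeroed where mask == 0.
df = (pred - target).detach() * mask

# Backpropagate the custom gradient instead of a scalar loss.
pred.backward(gradient=df)

# Masked pixels receive exactly zero gradient.
assert pred.grad[0, 1].item() == 0.0
```

Since pred is a leaf here, pred.grad simply equals df; in a real network the same df would flow back through the layers, so masked pixels never influence the weights.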
I didn't really try to use the depth branch alone for raw depth refinement, but I believe it is also feasible to do so.
If I understand correctly, does that mean this backpropagation process (with backward(df)) is different from simply using the scalar loss from the loss function and calling loss.backward() in the training code? loss.backward() would not give the masked region zero gradient.
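A small sketch of that difference, again with hypothetical tensors: an unmasked scalar loss sends gradient to every pixel, while applying the mask inside the loss before calling loss.backward() reproduces the zero-gradient-in-masked-region behavior of backward(df).

```python
import torch

pred = torch.tensor([[1.0, 2.0], [3.0, 4.0]], requires_grad=True)
target = torch.zeros(2, 2)
mask = torch.tensor([[1.0, 0.0], [1.0, 1.0]])

# Plain scalar loss over ALL pixels: masked pixels still get gradient.
loss = 0.5 * ((pred - target) ** 2).sum()
loss.backward()
assert pred.grad[0, 1].item() != 0.0

# Masking inside the loss recovers the same effect as backward(df).
pred.grad = None
loss_masked = 0.5 * (mask * (pred - target) ** 2).sum()
loss_masked.backward()
assert pred.grad[0, 1].item() == 0.0
```

So the two approaches can be made equivalent; the choice is between masking the hand-computed gradient or masking the loss itself.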
Sorry, I meant jointly doing surface normal estimation and depth refinement (or depth estimation). Given the relationship between depth and surface normals, these two tasks might help each other during training. However, when combining these two tasks I ran into another issue. I would like to use output.backward(df) for backpropagation; in your opinion, would it make a difference whether I call output_d.backward(df) first or output_sn.backward(df) first?
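For what it's worth, gradients in PyTorch accumulate into .grad across successive backward() calls, so the order of the two calls should not change the final gradient. A toy sketch with a single hypothetical shared parameter w standing in for a shared backbone feeding the two heads:

```python
import torch

# Hypothetical shared parameter; out_d / out_sn stand in for the two branch outputs.
w = torch.tensor(2.0, requires_grad=True)
out_d = w * 3.0   # "depth" branch
out_sn = w * 5.0  # "surface normal" branch

# Depth first, then surface normal (retain_graph keeps shared graph parts alive).
out_d.backward(gradient=torch.tensor(1.0), retain_graph=True)
out_sn.backward(gradient=torch.tensor(1.0))
grad_d_first = w.grad.item()

# Reset and repeat in the opposite order.
w.grad = None
out_d2 = w * 3.0
out_sn2 = w * 5.0
out_sn2.backward(gradient=torch.tensor(1.0), retain_graph=True)
out_d2.backward(gradient=torch.tensor(1.0))
grad_sn_first = w.grad.item()

# Gradients accumulate, so both orders give 3 + 5 = 8.
assert grad_d_first == grad_sn_first == 8.0
```

The one practical caveat is to call the optimizer step (and zero_grad) only after both backward passes, and to pass retain_graph=True on the first call when the branches share intermediate activations.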