Thanks for your great paper. I have a question about the class absol, line 31, https://github.com/cvlab-yonsei/DAQ/blob/main/models/resnet20_DAQ.py.
I don't understand your backward function — there seem to be a lot of redundant operations after grad_input = torch.sign(input). Is there a specific explanation for this backward function?
Sorry for the confusion. As you say, the operations after ''grad_input = torch.sign(input)'' may indeed be unnecessary. Thank you!
I think I suddenly understand: for floating-point zero, torch.sign returns 0. However, the abs function is not differentiable at zero, so the extra operations in this backward function make the gradient at 0 come out as 1 instead. I'm not sure whether my guess is correct.
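A minimal sketch of that guess, assuming the behavior described above (the class name `AbsWithSign` and the zero-handling line are hypothetical — the actual absol implementation in the DAQ repo may differ): a custom autograd Function whose backward starts from `torch.sign(input)` and then overrides the non-differentiable point 0 with the subgradient +1.

```python
import torch

class AbsWithSign(torch.autograd.Function):
    """Hypothetical sketch: abs forward, sign-based backward that
    maps the non-differentiable point 0 to +1 (as guessed in this
    thread; the repo's absol class may implement it differently)."""

    @staticmethod
    def forward(ctx, input):
        ctx.save_for_backward(input)
        return input.abs()

    @staticmethod
    def backward(ctx, grad_output):
        (input,) = ctx.saved_tensors
        grad_input = torch.sign(input)
        # torch.sign(0.) == 0.; pick the subgradient +1 at zero instead
        grad_input[input == 0] = 1.0
        return grad_output * grad_input

x = torch.tensor([-2.0, 0.0, 3.0], requires_grad=True)
AbsWithSign.apply(x).sum().backward()
print(x.grad)  # -> tensor([-1., 1., 1.]); plain torch.sign would give 0. at x == 0
```

With plain `grad_input = torch.sign(input)` and nothing else, the middle gradient would be 0 rather than 1, which is the distinction the question is about.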