Yes, it's a bug. According to this link: pytorch/pytorch#6394, the backward pass of `torch.sqrt()` generates NaN when the input is zero.
Actually, I'm surprised that PyTorch really puts an inf there; I thought the gradient was hard-coded to some very large but finite value. lol. My computer is currently occupied by other tasks, so I can't run tests right now, but I will fix this ASAP and let you know.
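To see why the inf appears (a plain-Python illustration, no PyTorch needed): the analytic derivative of `sqrt(x)` is `1 / (2 * sqrt(x))`, which blows up at `x = 0`; an eps placed *inside* the sqrt caps that derivative at `1 / (2 * sqrt(eps))`.

```python
import math

def sqrt_grad(x):
    # d/dx sqrt(x) = 1 / (2 * sqrt(x)); divides by zero at x = 0,
    # which autograd surfaces as inf (and becomes NaN once the chain
    # rule multiplies in a zero upstream gradient).
    return 1.0 / (2.0 * math.sqrt(x)) if x > 0 else math.inf

def safe_sqrt_grad(x, eps=1e-14):
    # With eps inside the sqrt, the gradient at x = 0 is bounded
    # by 1 / (2 * sqrt(eps)) = 5e6 for eps = 1e-14.
    return 1.0 / (2.0 * math.sqrt(x + eps))

print(sqrt_grad(0.0))       # inf
print(safe_sqrt_grad(0.0))  # ~5e6, finite
```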
Hi, I've changed that line to `num = torch.sqrt((y2-y1).square() + (x2-x1).square() + 1e-14)`.
The outer `1e-8` is no longer necessary, as the argument of the sqrt is now guaranteed positive. Please let me know if there are further issues.
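For readers following along, here is a minimal plain-Python sketch of the fixed center-distance term; the coordinate names mirror the snippet above, and the helper name is mine, not from the repo.

```python
import math

def center_distance(x1, y1, x2, y2, eps=1e-14):
    # Fixed version: eps sits *inside* the sqrt, so the argument is
    # strictly positive even when the two points coincide, and the
    # gradient of sqrt stays finite during backprop.
    return math.sqrt((y2 - y1) ** 2 + (x2 - x1) ** 2 + eps)

# Coinciding points no longer produce sqrt(0):
print(center_distance(1.0, 1.0, 1.0, 1.0))  # ~1e-7, not 0
```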
Hi, I have verified that the fix works for the backprop. I can confirm that the DIoU loss improves the performance of the detector considerably.
When backpropagating with the GIoU loss, there is a sqrt out-of-range error at the line here when the sqrt encounters zero values.
Should we add a small offset to the value inside the sqrt? I tried 1e-8 and training became unstable, while 1e-16 is fine.
`num = torch.sqrt((y2-y1).square() + (x2-x1).square() + 1e-16) + 1e-8`
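One back-of-the-envelope way to compare offset choices (my own check, not part of the thread): an offset `eps` inside the sqrt caps the gradient at zero distance at `1 / (2 * sqrt(eps))`, while biasing the loss value upward by `sqrt(eps)` when the distance is zero, so the choice trades gradient magnitude against value bias.

```python
import math

def grad_cap(eps):
    # Maximum gradient of sqrt(d + eps) w.r.t. d, attained at d = 0.
    return 1.0 / (2.0 * math.sqrt(eps))

def value_bias(eps):
    # Upward bias of the term at d = 0.
    return math.sqrt(eps)

for eps in (1e-8, 1e-16):
    print(f"eps={eps:g}  grad_cap={grad_cap(eps):g}  bias={value_bias(eps):g}")
# eps=1e-8  -> grad capped at 5e3, value bias 1e-4
# eps=1e-16 -> grad capped at 5e7, value bias 1e-8
```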