Thank you for your work! Reading the code in losses.py, I found the variable disable_torch_grad_focal_loss in AsymmetricLoss, but it doesn't appear anywhere else. Is disable_torch_grad_focal_loss only used for the paper's experimental comparisons? In normal use, do we need to modify it?
This is an option we experimented with a bit, but we didn't have enough room to cover it in the paper. The gradient analysis with this option enabled is somewhat different.
Try the default version first (without it).
Then you can compare the results against disable_torch_grad_focal_loss=True; I think in some cases it can improve results.
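To illustrate the distinction being discussed, here is a minimal sketch (not the repo's exact implementation) of what such a flag typically controls: whether the asymmetric focusing weight is computed inside `torch.no_grad()`, so that gradients do not flow back through the weight term itself. The helper name and parameter defaults below are hypothetical.

```python
import torch

def asymmetric_focus_weight(x_sigmoid, y, gamma_pos=1, gamma_neg=4,
                            disable_torch_grad=True):
    """Hypothetical helper: compute the asymmetric focusing weight
    (1 - p_t)^gamma, optionally outside of autograd."""
    # p_t: probability assigned to the true label per element
    pt = x_sigmoid * y + (1 - x_sigmoid) * (1 - y)
    # per-element focusing exponent: gamma+ for positives, gamma- for negatives
    gamma = gamma_pos * y + gamma_neg * (1 - y)
    if disable_torch_grad:
        # weight is treated as a constant during backprop
        with torch.no_grad():
            return torch.pow(1 - pt, gamma)
    # weight participates in the gradient computation
    return torch.pow(1 - pt, gamma)

logits = torch.randn(2, 3, requires_grad=True)
y = torch.randint(0, 2, (2, 3)).float()
p = torch.sigmoid(logits)
w = asymmetric_focus_weight(p, y, disable_torch_grad=True)
print(w.requires_grad)  # False: gradients do not flow through the weight
```

With the flag on, the loss gradient only sees the weighted cross-entropy term; with it off, the gradient also includes the derivative of the focusing weight, which is why the gradient analysis differs between the two settings.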