Sorry for the late reply; I have been very busy and didn't get around to answering the question.
You can add a check that changes how the loss is calculated when the number of instances is zero.
For example:
```python
if len(pred_polys) == 0:
    # No instances in this batch: summing the empty prediction tensor
    # yields a zero loss that still keeps the autograd graph connected,
    # avoiding the NaN produced by averaging over zero elements.
    loss = torch.sum(pred_polys)
else:
    loss = smooth_l1(pred_polys, target_polys)
```
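A minimal sketch of why this works, using `torch.nn.functional.smooth_l1_loss` as a stand-in for the repository's `smooth_l1` (the helper name `safe_poly_loss` and the tensor shapes are assumptions for illustration): with mean reduction, the loss on an empty tensor is 0/0 = NaN, while summing the empty predictions gives a differentiable zero.

```python
import torch
import torch.nn.functional as F

def safe_poly_loss(pred_polys, target_polys):
    # Hypothetical wrapper illustrating the zero-instance guard.
    if pred_polys.numel() == 0:
        # Sum over an empty tensor is 0 and still flows gradients,
        # so backward() remains well-defined for this batch.
        return torch.sum(pred_polys)
    return F.smooth_l1_loss(pred_polys, target_polys)

# A batch with zero instances (assumed polygon shape: N x points x 2).
pred = torch.zeros((0, 128, 2), requires_grad=True)
target = torch.zeros((0, 128, 2))

nan_loss = F.smooth_l1_loss(pred.detach(), target)  # mean over 0 elements -> NaN
safe_loss = safe_poly_loss(pred, target)            # 0.0, backward-safe
safe_loss.backward()
```

The same guard can be applied to each of the smooth L1 terms (Linit, Lcoarse, Liter) before they are summed into the total loss.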
Hello, if the number of instances in a batch is zero, the smooth L1 losses (Linit, Lcoarse, Liter) become NaN. How can I address this problem?