No problems with all losses, except JaccardLoss #884
Hi @JonasZaoui, thanks for the issue!
Hi @qubvel, thanks for your answer.
My hypothesis is that the model optimises for the heavily dominant majority class, the background, and predicts 0 everywhere. When I use classes=[1, 2], on the other hand, I get predictions for class 1 (but not for class 2). So it's weird...
Ok, thanks. I quickly checked the code but did not find any obvious issue; I will try to investigate further. If you do identify the cause, please let me know!
Of course! Thank you for your time.
Do you think that if I ignore the background in the IoU computation, it could fix the problem? (Or improve performance for imbalanced binary segmentation?)
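To see why ignoring the background can matter, here is a minimal plain-Python sketch (no framework; the helper name `iou_per_class` is made up for illustration) of per-class IoU on an imbalanced mask. An all-background prediction scores well on the background class but zero on the foreground, so averaging over both hides the failure:

```python
# Per-class IoU over flat label lists (illustrative helper, not library code).
def iou_per_class(pred, target, classes):
    scores = {}
    for c in classes:
        inter = sum(1 for p, t in zip(pred, target) if p == c and t == c)
        union = sum(1 for p, t in zip(pred, target) if p == c or t == c)
        scores[c] = inter / union if union else 1.0  # empty class: perfect by convention
    return scores

# 10-pixel mask: 8 background (0), 2 foreground (1); model predicts all background.
target = [0, 0, 0, 0, 0, 0, 0, 0, 1, 1]
pred = [0] * 10

with_bg = iou_per_class(pred, target, classes=[0, 1])
without_bg = iou_per_class(pred, target, classes=[1])

print(with_bg)     # background IoU is high (0.8), foreground IoU is 0.0
print(without_bg)  # only the foreground's 0.0 remains
```

Dropping class 0 from the average makes the metric (and a loss built on it) reflect only the class you actually care about.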
That might help! Or you can consider class weights.
I have only one class (0 for background, 1 for the object). When I ignore class 0, my Dice loss is null. When I use weighted BCE it's okay. Never mind the IoU :)
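For context, a plain-Python sketch of the weighted-BCE idea mentioned above (the function and its `pos_weight` argument mirror the concept behind PyTorch's `BCEWithLogitsLoss(pos_weight=...)`, but are written here from scratch for illustration). Up-weighting the positive class makes an all-background prediction expensive:

```python
import math

def weighted_bce(probs, targets, pos_weight):
    """Mean binary cross-entropy with the positive class up-weighted."""
    eps = 1e-7
    total = 0.0
    for p, t in zip(probs, targets):
        p = min(max(p, eps), 1 - eps)
        total += -(pos_weight * t * math.log(p) + (1 - t) * math.log(1 - p))
    return total / len(probs)

# All-background prediction on an imbalanced mask (8 negatives, 2 positives).
targets = [0] * 8 + [1, 1]
probs = [0.05] * 10  # model says "background everywhere"

plain = weighted_bce(probs, targets, pos_weight=1.0)
weighted = weighted_bce(probs, targets, pos_weight=4.0)  # ratio of negatives to positives
print(plain, weighted)  # the weighted loss penalises the missed foreground harder
```

A common starting point for `pos_weight` is the negative-to-positive pixel ratio, tuned from there.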
If you have only a background and one class, your case can be considered as binary segmentation. There is an example notebook with dice loss for such a case. |
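As a rough sketch of what a soft Dice loss computes in the binary case (plain Python; names and the `smooth` constant here are illustrative, not the library's internals), note how an all-background prediction on an imbalanced mask gets a clearly worse loss than one that finds the foreground:

```python
def soft_dice_loss(probs, targets, smooth=1.0):
    """1 - Dice coefficient on per-pixel probabilities."""
    inter = sum(p * t for p, t in zip(probs, targets))
    total = sum(probs) + sum(targets)
    return 1.0 - (2.0 * inter + smooth) / (total + smooth)

targets = [0] * 8 + [1, 1]
good = soft_dice_loss([0.05] * 8 + [0.9, 0.9], targets)
bad = soft_dice_loss([0.05] * 10, targets)  # all-background prediction
print(good, bad)  # bad >> good
```

Unlike per-pixel BCE, the Dice loss is driven by the overlap on the foreground, which is why it is often recommended for imbalanced binary segmentation.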
Hi!
A quick question about a loss: I've used quite a few losses and loss mixtures without any problems, but when I use the IoU loss I only get empty predictions. My ground truth is a multiclass mask, without one-hot encoding the labels on channels.
loss = smp.losses.JaccardLoss(mode='multiclass', from_logits=True)
However, the other losses work fine with a multiclass mask and with logits too.