Got a very low mIoU after simply swapping out the cross-entropy loss for "lovasz_softmax" #20
Comments
There can be some hyperparameters to adjust when optimizing with the Lovász loss alone (compared to e.g. cross-entropy + lovasz_softmax), but 0.003 mIoU clearly points to a problem somewhere.
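For illustration, a minimal sketch of the combined setup mentioned above; the 0.5 weight, the import path, and the 255 ignore label (the Pascal VOC void class) are assumptions, not values from this thread:

```python
import torch.nn.functional as F

from lovasz_losses import lovasz_softmax  # lovasz_losses.py from this repository

def combined_loss(logits, labels, lovasz_weight=0.5, ignore_index=255):
    # Hypothetical weighting between the two terms; 0.5 is an arbitrary example.
    ce = F.cross_entropy(logits, labels, ignore_index=ignore_index)
    # lovasz_softmax expects class probabilities, so apply softmax to the logits.
    lv = lovasz_softmax(F.softmax(logits, dim=1), labels, ignore=ignore_index)
    return ce + lovasz_weight * lv
```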
The Lovász-Softmax loss approximates the IoU computed over one batch, not over the whole dataset. This can be problematic for small batches, especially for classes absent from the current batch. In practice it can lead your output to converge towards predicting only the background, or not to converge at all. only_present counts the IoU only over the classes present in the batch, which mitigates this.
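For illustration, a small sketch of this option, assuming the signature in this repository's lovasz_losses.py (newer versions expose the flag as a `classes` argument instead of a boolean `only_present`):

```python
import torch
import torch.nn.functional as F

from lovasz_losses import lovasz_softmax  # lovasz_losses.py from this repository

# Toy batch: 21 VOC classes, but only classes 0-2 actually appear in the labels.
logits = torch.randn(2, 21, 8, 8)
labels = torch.randint(0, 3, (2, 8, 8))

probas = F.softmax(logits, dim=1)  # the loss expects probabilities

# Averaging over all 21 classes penalizes the 18 absent ones as well:
loss_all = lovasz_softmax(probas, labels, classes='all')
# classes='present' (only_present=True in older versions) restricts the
# average to classes that occur in the batch, as described above:
loss_present = lovasz_softmax(probas, labels, classes='present')
```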
@bermanmaxim how would you implement this for the binary case, where the option is only to set …
Hello, it was nice to read this paper. I have run into a problem: I get a very low mIoU (0.003) from DeepLabV3+ with Lovasz_softmax, whereas it normally achieves mIoU = 76% with cross-entropy loss.
Environment:
- PyTorch 1.0
- Ubuntu 16.04
- batch size: 10
- dataset: Pascal VOC 2012 (aug)
- ImageNet-pretrained ResNet-101 weights loaded
And here is the code for the Lovasz softmax:
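(For reference, a minimal sketch of a typical integration follows; the model, optimizer, and ignore label below are assumptions for illustration, not the actual code from this report. Note that lovasz_softmax takes class probabilities, so raw logits must be passed through softmax first.)

```python
import torch.nn.functional as F

from lovasz_losses import lovasz_softmax  # lovasz_losses.py from this repository

def train_step(model, images, labels, optimizer, ignore_index=255):
    # All names here (model, optimizer, the 255 VOC void label) are assumed.
    optimizer.zero_grad()
    logits = model(images)             # [B, C, H, W] raw class scores
    probas = F.softmax(logits, dim=1)  # lovasz_softmax expects probabilities
    loss = lovasz_softmax(probas, labels, classes='present', ignore=ignore_index)
    loss.backward()
    optimizer.step()
    return loss.item()
```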
Thanks!