New losses: focal loss & generalised dice loss #46
Conversation
Focal Loss implementation: works in log space to be numerically stable. I first tried without, but very easily got NaNs when training. Basically, it boosts the loss for the cases where objects are not detected correctly --> avoids FN predictions (ref). In addition, the model effectively incorporates small objects, since the loss for these objects is very high.
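For reference, a minimal PyTorch sketch of a binary focal loss computed in log space (the class name, the `gamma`/`alpha` defaults, and the assumption that the network outputs raw logits are illustrative, not necessarily the PR's exact code):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class BinaryFocalLoss(nn.Module):
    """Focal loss computed entirely in log space for numerical stability."""

    def __init__(self, gamma=2.0, alpha=0.25):
        super().__init__()
        self.gamma = gamma  # down-weights easy, well-classified pixels
        self.alpha = alpha  # balances foreground vs. background

    def forward(self, logits, target):
        # log(p) and log(1 - p) obtained stably from raw logits,
        # avoiding the NaNs that a naive log(sigmoid(x)) produces.
        log_p = F.logsigmoid(logits)
        log_1mp = F.logsigmoid(-logits)
        # Log-likelihood of the true class at each pixel.
        log_pt = target * log_p + (1 - target) * log_1mp
        pt = log_pt.exp()
        alpha_t = target * self.alpha + (1 - target) * (1 - self.alpha)
        # (1 - pt)^gamma boosts the loss where objects are missed (low pt).
        return (-alpha_t * (1 - pt) ** self.gamma * log_pt).mean()
```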
Display Dice Loss results in the terminal while using a new loss --> as a reference / safety check, since we have a good idea of the ideal Dice loss curve:

[terminal screenshots of the Dice loss displayed during training]
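A sketch of the kind of monitoring meant here, assuming a standard training loop (all loop variable names are illustrative):

```python
import torch

def hard_dice(pred_prob, target, eps=1e-8):
    """Binarized Dice used only for monitoring, not as the training loss."""
    pred = (pred_prob > 0.5).float()
    inter = (pred * target).sum()
    return (2 * inter + eps) / (pred.sum() + target.sum() + eps)

# e.g. inside the training loop:
# print(f"step {step}: loss={loss.item():.4f} "
#       f"dice={hard_dice(torch.sigmoid(logits), target).item():.4f}")
```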
Generalised Dice Loss: adapted from the original paper (multi-class segmentation) for our task (i.e. binary segmentation).
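A possible binary reduction of the Generalised Dice Loss, as a sketch under the assumption that `pred` holds probabilities in [0, 1] (not necessarily the PR's exact code):

```python
import torch

def generalised_dice_loss(pred, target, eps=1e-8):
    """Binary reduction of the Generalised Dice Loss: the two 'classes'
    are the foreground mask and its complement, each weighted by the
    inverse squared volume so small objects count as much as large ones."""
    pred = torch.stack([pred, 1 - pred], dim=0).flatten(1)        # (2, N)
    target = torch.stack([target, 1 - target], dim=0).flatten(1)
    w = 1.0 / (target.sum(dim=1) ** 2 + eps)                      # per-class weight
    inter = (w * (pred * target).sum(dim=1)).sum()
    union = (w * (pred + target).sum(dim=1)).sum()
    return 1.0 - 2.0 * inter / (union + eps)
```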
Mixed Loss: combination of the focal loss and the log of the dice loss. The log is also used here to boost the loss when objects are not detected correctly --> dice close or equal to zero. To bring the two losses to a similar scale, a new hyperparameter is introduced.
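A sketch of such a combination, where `lam` stands in for the new scaling hyperparameter (its actual name in the PR may differ):

```python
import torch
import torch.nn.functional as F

def mixed_loss(logits, target, lam=0.5, gamma=2.0, eps=1e-8):
    """Focal term + lam * (-log(dice)); -log(dice) explodes as dice -> 0,
    i.e. exactly when an object is missed."""
    # Focal term, again in log space (pt = probability of the true class).
    bce = F.binary_cross_entropy_with_logits(logits, target, reduction="none")
    pt = torch.exp(-bce)
    focal = ((1 - pt) ** gamma * bce).mean()
    # Soft dice term on probabilities.
    pred = torch.sigmoid(logits)
    inter = (pred * target).sum()
    dice = (2 * inter + eps) / (pred.sum() + target.sum() + eps)
    return focal - lam * torch.log(dice)
```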
Soft Dice Loss: implemented to allow the use of the Dice Loss during mixup experiments (where masks are not binary).
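A minimal sketch of a soft Dice loss, i.e. Dice without binarizing either input:

```python
def soft_dice_loss(pred, target, eps=1e-8):
    # No thresholding: `pred` (probabilities) and `target` (possibly a
    # mixup-blended, non-binary mask) are used as-is.
    inter = (pred * target).sum()
    return 1.0 - (2.0 * inter + eps) / (pred.sum() + target.sum() + eps)
```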
@olix86: PR ready! Could I ask you to review it? Thanks :)
Everything looks good to me, I just have a small doubt about the implementation of the new dice score, let me know what you think :)
Edit: should be fixed by the following commit.
`ValueError: Shape mismatch: im1 and im2 must have the same shape.`
Just adding new subjects to the config file, recently added by @alexfoias to …
Maybe ignore the dice_loss values if there is no segmented object (instead of returning one).
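A sketch of this suggestion, assuming a numpy-based metric aggregated with `np.nanmean` (function names are illustrative):

```python
import numpy as np

def dice_or_nan(pred, gt):
    # No object in the ground truth: skip the sample instead of scoring it.
    if not gt.any():
        return np.nan
    return 2.0 * np.logical_and(pred, gt).sum() / (pred.sum() + gt.sum())

# NaNs are then dropped when averaging over the dataset:
# mean_dice = np.nanmean([dice_or_nan(p, g) for p, g in val_pairs])
```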
Looks like the last commit that added new contrasts broke something, I think it's because they were not added to …
Alright, this should fix it, but I'm not super knowledgeable in MRI contrasts, so somebody should probably check if it makes sense :)
Is there a reason why there's the following line to prevent using both FiLM and mixup at the same time? (main.py line 61) |
The reason was: if we used FiLM and mixup at the same time, how would you apply mixup to the metadata (i.e. the FiLM input)? Let's say we do a mixup between a T1w and a T2star: ideally, we would like to do a mixup on their metadata as well when feeding the FiLM generator.
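Purely to illustrate the open question (this is not what the repo does; the PR keeps the two features mutually exclusive), mixup could in principle blend one-hot contrast metadata with the same coefficient as the images:

```python
import torch

def mixup_with_metadata(x_a, x_b, meta_a, meta_b, alpha=1.0):
    """Blend the images and their one-hot contrast metadata with the same
    coefficient, so the FiLM generator sees a consistently mixed input.
    Hypothetical sketch; all names are illustrative."""
    lam = torch.distributions.Beta(alpha, alpha).sample()
    x = lam * x_a + (1 - lam) * x_b              # mixed image
    meta = lam * meta_a + (1 - lam) * meta_b     # e.g. 0.7*T1w + 0.3*T2star
    return x, meta, lam
```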
Checked! All good 👍 |
Done by: …
@olix86: That's ready! Could you please review when it suits you?
@charleygros looks good to me, I think the only thing missing was adding the new contrasts from your last commit to the json file (I found 4 and added them). If that's ok with you I think we're ready to merge 😄 |
We are facing a severe class-imbalance issue since the introduction of the MS lesion segmentation task. This PR allows the use of new loss functions.

Done:
- dice_score made robust to the "both empty" issue: Fixes RuntimeWarning: invalid value encountered in double_scalars #52.
- mixup experiments: details
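For context, a sketch of what the "both empty" fix could look like (the `im1`/`im2` names echo the ValueError quoted earlier; the repo's exact code may differ):

```python
import numpy as np

def dice_score(im1, im2):
    im1 = np.asarray(im1).astype(bool)
    im2 = np.asarray(im2).astype(bool)
    if im1.shape != im2.shape:
        raise ValueError("Shape mismatch: im1 and im2 must have the same shape.")
    # Both masks empty: the plain formula divides 0 by 0 (the
    # RuntimeWarning above); treat "both empty" as perfect agreement.
    if not im1.any() and not im2.any():
        return 1.0
    return 2.0 * np.logical_and(im1, im2).sum() / (im1.sum() + im2.sum())
```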