
Add distance based loss #178

Merged: 4 commits into main on Dec 6, 2023
Conversation

@constantinpape (Owner)

No description provided.

@constantinpape constantinpape marked this pull request as ready for review December 6, 2023 20:03
@constantinpape (Owner, Author) left a comment:

# The dice loss always interprets the first axis as the channel axis
# and treats it differently (it reduces over each channel independently).
# Dropping the channel axis here would therefore yield a very large dice
# loss that dominates everything else, so we slice with 0:1 to keep it.
fg_input, fg_target = input_[:, 0:1], target[:, 0:1]
@constantinpape (Owner, Author):

This is crucial: if we don't keep the channel dimension here, the dice loss computes the wrong thing.
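To illustrate why the `0:1` slice matters, here is a minimal sketch (the `dice_score` below is a hypothetical stand-in for the repository's actual dice implementation, not its code): slicing with `0:1` keeps the channel axis, while integer indexing with `0` drops it, so a channel-aware dice reduction would then wrongly treat the batch axis as channels.

```python
import numpy as np

def dice_score(pred, target, eps=1e-7):
    # Hypothetical per-channel soft dice: reduce over every axis except
    # the channel axis (axis 1), then average over channels.
    axes = (0,) + tuple(range(2, pred.ndim))
    intersection = (pred * target).sum(axis=axes)
    denominator = pred.sum(axis=axes) + target.sum(axis=axes)
    return ((2 * intersection + eps) / (denominator + eps)).mean()

x = np.random.rand(2, 3, 8, 8)  # batch, channels, height, width

# Slicing with 0:1 keeps the channel axis, so the dice reduction
# still sees a (batch, channel, H, W) tensor:
print(x[:, 0:1].shape)  # (2, 1, 8, 8)

# Integer indexing with 0 drops the axis; a channel-aware dice would
# then reduce over the wrong dimensions:
print(x[:, 0].shape)    # (2, 8, 8)
```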

return overall_loss


class DiceBasedDistanceLoss(DistanceLoss):
@constantinpape (Owner, Author):

I tried this by accident (using the dice loss for both the foreground and the distance loss terms) and it seems to work really well. We should give it a try.
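A rough sketch of that idea, assuming a channel layout where the foreground channel comes first and the distance channels follow (the function names and layout here are assumptions for illustration, not the actual torch-em implementation):

```python
import numpy as np

def dice_loss(pred, target, eps=1e-7):
    # Per-channel soft dice loss (1 - dice score); a stand-in for the
    # repository's real dice implementation.
    axes = (0,) + tuple(range(2, pred.ndim))
    intersection = (pred * target).sum(axis=axes)
    denominator = pred.sum(axis=axes) + target.sum(axis=axes)
    return 1.0 - ((2 * intersection + eps) / (denominator + eps)).mean()

def dice_based_distance_loss(input_, target):
    # Hypothetical sketch: apply the dice loss both to the foreground
    # channel and to the distance channels, then sum the two terms.
    # Slicing with 0:1 keeps the channel axis (see the comment above).
    fg_input, fg_target = input_[:, 0:1], target[:, 0:1]
    dist_input, dist_target = input_[:, 1:], target[:, 1:]
    return dice_loss(fg_input, fg_target) + dice_loss(dist_input, dist_target)
```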

Contributor:

Okay, noted!

@@ -27,7 +27,8 @@ def __init__(
     out_channels=1,
     use_sam_stats=False,
     use_mae_stats=False,
-    encoder_checkpoint_path=None
+    encoder_checkpoint_path=None,
+    final_activation=None,
@constantinpape (Owner, Author):

As discussed, it's not a good idea to hard-code the final activation to sigmoid. I updated this to use the same logic and syntax as in our U-Net implementation.

@constantinpape constantinpape merged commit 8319f2f into main Dec 6, 2023
4 checks passed
@constantinpape constantinpape deleted the distance-loss branch December 6, 2023 20:08