
predicted mask format #21

Closed

Feanor007 opened this issue Apr 1, 2022 · 3 comments

Feanor007 commented Apr 1, 2022

During training, should I convert the output from the U-Net to a binary mask prior to feeding it into soft_dice_cldice? I used raw logits, but the training result is really bad.
However, when I tried converting the output to a binary mask, I lost the gradient on the tensor.
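A minimal sketch of the issue (tensor shapes and names are illustrative, not from the original post): hard thresholding cuts the autograd graph, while a sigmoid keeps the output differentiable.

```python
import torch

logits = torch.randn(1, 1, 64, 64, requires_grad=True)  # raw U-Net output

hard_mask = (logits > 0).float()   # thresholding is non-differentiable
print(hard_mask.grad_fn)           # None -> graph is cut, no gradient flows back

soft_mask = torch.sigmoid(logits)  # soft probabilities stay differentiable
print(soft_mask.grad_fn)           # <SigmoidBackward0 ...> -> gradients flow
```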

jocpae (Owner) commented Apr 5, 2022

Hi, thanks for reaching out. To make it short, you should not convert the output to a binary mask. Did you also train using only the dice loss (alpha=0)? What do your results look like in this case? :)

Feanor007 (Author) commented
@jocpae Thanks for replying. I think I fixed this by just applying a sigmoid to the logits. You are right, I shouldn't convert the output to only 0 and 1. I trained with alpha = 0.5 and found that the results are better when combining soft_dice_cldice with another loss such as BCE.
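A rough sketch of this setup; the import path, constructor arguments, and argument order of soft_dice_cldice are assumptions, so check cldice.py in this repository for the exact interface.

```python
import torch
import torch.nn as nn
from cldice import soft_dice_cldice  # adjust the import to the repo layout

cldice = soft_dice_cldice(alpha=0.5)  # alpha assumed to weight soft-dice vs. soft-clDice
bce = nn.BCEWithLogitsLoss()          # BCE term operates on raw logits

def combined_loss(logits, target):
    probs = torch.sigmoid(logits)     # soft probabilities, not a hard 0/1 mask
    return cldice(target, probs) + bce(logits, target)
```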

jocpae (Owner) commented Apr 5, 2022

Good to hear :)
I do not know what kind of data you are working on, but in our experience it has always been a good idea to test a wide range of alpha values empirically to find out what best suits your dataset. I am happy to comment on visual results if you would like to share them :)
