
Add 2D Probabilistic UNet #119

Merged: 5 commits merged into constantinpape:main on Apr 11, 2023
Conversation

@anwai98 (Contributor) commented Apr 6, 2023

  • Add Probabilistic UNet with torch_em.model.UNet2d backbone

@constantinpape (Owner) left a comment:

Thanks! I just had a brief initial look and found one obvious thing that needs to be changed. I will have a closer look at the rest in the next few days.

from torch_em.model import UNet2d
from torch_em.loss.dice import DiceLossWithLogits

device = torch.device('cuda' if torch.cuda.is_available() else 'cpu')
@constantinpape (Owner):

Don't define this globally. Instead, pass it as an argument to the functions or classes that need it.
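The suggestion above can be sketched as follows. This is a minimal illustration, not the PR's actual code; `resolve_device` and `predict` are hypothetical names chosen for the example:

```python
import torch

def resolve_device(device=None):
    """Pick a device lazily instead of at module import time.

    Accepting the device as a parameter keeps the module import-safe
    and makes the choice testable; None falls back to CUDA if available.
    """
    if device is None:
        device = "cuda" if torch.cuda.is_available() else "cpu"
    return torch.device(device)

def predict(model, x, device=None):
    # Hypothetical usage: the caller (or the default) decides the device,
    # rather than every function reading one shared global.
    device = resolve_device(device)
    model = model.to(device)
    with torch.no_grad():
        return model(x.to(device))
```

This keeps `import`-time behavior free of side effects (no CUDA probing just for importing the module) and lets tests pin the code to CPU explicitly.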

@constantinpape (Owner) left a comment:

I had a closer look and left a few more comments. In addition to addressing those, we should also add a test; see for example https://github.com/constantinpape/torch-em/blob/main/test/model/test_unet.py.

num_classes=None
):

super(Encoder, self).__init__()
@constantinpape (Owner):

This style of writing the super calls is outdated. It's better to just write `super().__init__()` instead.
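For illustration, here are the two forms side by side, with plain classes standing in for the PR's `nn.Module` subclasses (the class names are invented for this sketch):

```python
class Encoder:
    def __init__(self, depth):
        self.depth = depth

class OldStyleEncoder(Encoder):
    def __init__(self, depth, latent_dim):
        # Python 2 style: repeats the class name, which breaks silently
        # if the class is renamed or the code is copied into a new class.
        super(OldStyleEncoder, self).__init__(depth)
        self.latent_dim = latent_dim

class NewStyleEncoder(Encoder):
    def __init__(self, depth, latent_dim):
        # Python 3 style: zero-argument super() is equivalent here
        # and stays correct under renames.
        super().__init__(depth)
        self.latent_dim = latent_dim
```

Both forms resolve to the same base-class call via the MRO; the zero-argument form is simply the idiomatic Python 3 spelling.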

num_classes=None
):

super(AxisAlignedConvGaussian, self).__init__()
@constantinpape (Owner):

See above.

use_tile=True
):

super(Fcomb, self).__init__()
@constantinpape (Owner):

See above.

rl_swap=False
):

super(ProbabilisticUnet, self).__init__()
@constantinpape (Owner):

See above.

"""

def __init__(
self,
@constantinpape (Owner):

Just a minor thing, but the usual style for these inits is to have only one indent:

def __init__(
    self,
    aa,
    bb,
    cc,
    dd,
    ...
):
   ...

(also relevant in several places below).

return self.last_layer(output)


class ProbabilisticUnet(nn.Module):
@constantinpape (Owner):

Let's call it ProbabilisticUNet to be consistent with the other UNet naming.

@anwai98 (Contributor, Author) commented Apr 10, 2023

Thanks for the feedback. The comments on naming consistency (#119 (comment)), indentation (#119 (comment)), and the style of super calls (#119 (comment)) have been addressed.

(I'll work on adding a test for ProbabilisticUNet)

@constantinpape (Owner) left a comment:

One more minor comment; besides this, the changes look good.

torch_em/model/probabilistic_unet.py (outdated, resolved)
@constantinpape (Owner) left a comment:

Looks good now!

@constantinpape constantinpape merged commit da3b4c9 into constantinpape:main Apr 11, 2023