
Update Prob-DA Source UNet Parameters #115

Merged: 2 commits, Mar 28, 2023

Conversation

@anwai98 (Contributor) commented Mar 28, 2023

  • updated the patch shape: (512, 512) -> (256, 256)
  • added a save_root argument to the evaluate_source_model function so that checkpoints are loaded from the desired path (see the sketch below)
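
A minimal sketch of the save_root idea, not the repository's actual code (the real evaluate_source_model has a different signature and loading logic); it only illustrates how an explicit save_root routes the checkpoint lookup instead of relying on a hard-coded default:

import os

# Hypothetical sketch only: the checkpoint layout and the function signature
# here are assumptions, not the repository's real API.
def evaluate_source_model(args, cell_type, save_root=None):
    # Fall back to a default directory when no save_root is given.
    root = save_root if save_root is not None else "./checkpoints"
    checkpoint = os.path.join(root, f"unet_source_{cell_type}", "best.pt")
    print("Evaluating with checkpoint:", checkpoint)
    # ... load the UNet weights from `checkpoint` and run inference here ...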

@@ -212,11 +215,11 @@ def _parse_aug(aug):
     return loader


-def get_supervised_loader(args, split, cell_type):
-    patch_shape = (512, 512)
+def get_supervised_loader(args, split, cell_type, batch_size):
@constantinpape (Owner) commented:

The supervised loader is used in unet_adamt as well, so you need to update the code there too.

@anwai98 (Author) replied:

Updated it just now in the unet_adamt.py training script.
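
For context, a rough sketch of the shared helper after this change (the body is elided and assumed): both the source-model training script and unet_adamt.py build their labelled training loaders through this one function, which is why the new batch_size argument and the smaller patch shape have to be reflected in both places.

# Signature and patch shape follow the diff above; everything in the body is
# an assumption, shown only to make the shared-usage point concrete.
def get_supervised_loader(args, split, cell_type, batch_size):
    patch_shape = (256, 256)  # updated from (512, 512) in this PR
    # ... build and return the labelled-data loader for `split` / `cell_type`,
    # using `patch_shape` and the caller-supplied `batch_size` ...
    raise NotImplementedError

# Both training scripts would call it the same way, e.g. (illustrative values):
# loader = get_supervised_loader(args, "train", cell_type, batch_size=8)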

@@ -212,11 +215,11 @@ def _parse_aug(aug):
     return loader


-def get_supervised_loader(args, split, cell_type):
-    patch_shape = (512, 512)
@constantinpape (Owner) commented:

Let's change this in the unsupervised loader as well.

@anwai98 (Author) commented Mar 28, 2023

Updated this as well (though I am now curious to see how the joint training setup works with the larger context of (512, 512) patches).

@constantinpape constantinpape merged commit 42c5228 into constantinpape:main Mar 28, 2023