
Merge genaidev Into Dev #7886

Merged: 36 commits, Jul 19, 2024

Commits
fac754d  6676 port generative networks autoencoderkl (#7260)  (marksgraham, Dec 5, 2023)
b3fdfdd  6676 port generative networks vqvae (#7285)  (marksgraham, Dec 7, 2023)
c61c6ac  6676 port generative networks transformer (#7300)  (marksgraham, Dec 11, 2023)
de0a476  6676 port generative networks ddpm (#7304)  (marksgraham, Dec 12, 2023)
43bc023  6676 port generative networks controlnet (#7312)  (marksgraham, Dec 14, 2023)
b85a534  Adds patchgan discriminator (#7319)  (marksgraham, Dec 14, 2023)
aa4a4db  6676 port generative networks spade (#7320)  (marksgraham, Dec 19, 2023)
3447b09  6676 port diffusion schedulers (#7332)  (marksgraham, Jan 3, 2024)
3ab5c62  6676 port diffusion schedulers (#7364)  (marksgraham, Jan 5, 2024)
0a549fe  Adds ordering util (#7369)  (marksgraham, Jan 8, 2024)
510f7bc  6676 port generative inferers (#7379)  (marksgraham, Jan 18, 2024)
41fb3ff  [Attention block] relative positional embedding (#7346)  (vgrau98, Jan 18, 2024)
f15a173  6676 port generative engines (#7406)  (marksgraham, Feb 1, 2024)
ba188e2  monai generative: refactor autoencoderkl (#7552)  (marksgraham, Apr 23, 2024)
1a57b55  7227 refactor transformer and diffusion model unet (#7715)  (marksgraham, May 10, 2024)
c54bf3c  Tidy up init (#7755)  (marksgraham, May 13, 2024)
a052c44  Only have contigous calls after attention blocks (#7763)  (marksgraham, May 14, 2024)
a423bcd  Neater use off nn.Sequential in controlnet (#7754)  (marksgraham, May 22, 2024)
36511cc  Addition of SPADE Network + tests and modification of SPADE normalis…  (virginiafdez, Jun 3, 2024)
98550c0  Scheduler Clip Fix (#7855)  (virginiafdez, Jun 21, 2024)
15ff663  Merging Dev Into gen-ai-dev and Undeclared Variable Fixes (#7887)  (ericspod, Jul 2, 2024)
95c2200  Resolving conflicts  (ericspod, Jul 2, 2024)
97ebde8  Resolving conflicts  (ericspod, Jul 2, 2024)
72a7fa0  Resolving conflicts  (ericspod, Jul 2, 2024)
1ef263d  Resolving conflicts  (ericspod, Jul 2, 2024)
ec06090  DCO Remediation Commit for Eric Kerfoot <17726042+ericspod@users.nore…  (ericspod, Jul 2, 2024)
d5da737  Further conflict resolutions and replacing regressed changes  (ericspod, Jul 2, 2024)
ff90736  Formatting  (ericspod, Jul 2, 2024)
dbf6538  Merge branch 'dev' into gen-ai-dev  (ericspod, Jul 9, 2024)
7828781  Update to merge changes to SABlock, possibly resolving conflicts betw…  (ericspod, Jul 10, 2024)
54e180d  Formatting  (ericspod, Jul 10, 2024)
b691308  Typing fix  (ericspod, Jul 17, 2024)
2043db2  Merge branch 'dev' into gen-ai-dev  (ericspod, Jul 17, 2024)
4b46cc4  Merge branch 'dev' into gen-ai-dev  (ericspod, Jul 18, 2024)
95c73ec  Minor fix  (ericspod, Jul 19, 2024)
e1d1790  Minor fix  (ericspod, Jul 19, 2024)
Files changed
docs/source/engines.rst: 5 additions & 0 deletions
@@ -30,6 +30,11 @@ Workflows
 .. autoclass:: GanTrainer
     :members:

+`AdversarialTrainer`
+~~~~~~~~~~~~~~~~~~~~
+.. autoclass:: AdversarialTrainer
+    :members:
+
 `Evaluator`
 ~~~~~~~~~~~
 .. autoclass:: Evaluator
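A minimal sketch of how the newly documented AdversarialTrainer might be wired up. The constructor arguments below (g_network, recon_loss_function, d_network, and so on) are assumptions inferred from the GAN-style trainers this PR ports, not an authoritative signature; check the rendered docs:

    # Hedged sketch only: argument names are assumed, stand-in networks replace
    # the real generator (e.g. AutoencoderKL) and PatchGAN discriminator.
    import torch
    import torch.nn as nn
    from torch.utils.data import DataLoader, TensorDataset
    from monai.engines import AdversarialTrainer

    images = torch.rand(8, 1, 32, 32)
    loader = DataLoader(TensorDataset(images), batch_size=2)
    generator = nn.Conv2d(1, 1, 3, padding=1)      # stand-in for e.g. AutoencoderKL
    discriminator = nn.Conv2d(1, 1, 3, padding=1)  # stand-in for a PatchGAN critic

    trainer = AdversarialTrainer(
        device=torch.device("cpu"),
        max_epochs=1,
        train_data_loader=loader,
        g_network=generator,
        g_optimizer=torch.optim.Adam(generator.parameters(), 1e-4),
        g_loss_function=nn.BCEWithLogitsLoss(),
        recon_loss_function=nn.L1Loss(),
        d_network=discriminator,
        d_optimizer=torch.optim.Adam(discriminator.parameters(), 1e-4),
        d_loss_function=nn.BCEWithLogitsLoss(),
    )
    trainer.run()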
docs/source/inferers.rst: 23 additions & 0 deletions
@@ -49,6 +49,29 @@ Inferers
     :members:
     :special-members: __call__

+`DiffusionInferer`
+~~~~~~~~~~~~~~~~~~
+.. autoclass:: DiffusionInferer
+    :members:
+    :special-members: __call__
+
+`LatentDiffusionInferer`
+~~~~~~~~~~~~~~~~~~~~~~~~
+.. autoclass:: LatentDiffusionInferer
+    :members:
+    :special-members: __call__
+
+`ControlNetDiffusionInferer`
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+.. autoclass:: ControlNetDiffusionInferer
+    :members:
+    :special-members: __call__
+
+`ControlNetLatentDiffusionInferer`
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+.. autoclass:: ControlNetLatentDiffusionInferer
+    :members:
+    :special-members: __call__

 Splitters
 ---------
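The four inferers documented above wrap a trained network and a noise scheduler behind a single call. A sketch of the expected usage for the plain DiffusionInferer; the import paths follow this PR, but the DiffusionModelUNet constructor arguments are assumptions and may differ in detail:

    import torch
    from monai.inferers import DiffusionInferer
    from monai.networks.nets import DiffusionModelUNet
    from monai.networks.schedulers import DDPMScheduler

    # assumed constructor arguments; channel counts kept divisible by the
    # default norm group size
    model = DiffusionModelUNet(
        spatial_dims=2, in_channels=1, out_channels=1,
        channels=(32, 64, 64), attention_levels=(False, False, True),
        num_res_blocks=1, num_head_channels=64,
    )
    scheduler = DDPMScheduler(num_train_timesteps=1000)
    inferer = DiffusionInferer(scheduler)

    images = torch.rand(2, 1, 32, 32)
    noise = torch.randn_like(images)
    timesteps = torch.randint(0, 1000, (images.shape[0],)).long()

    # training: predict the noise that was added at the sampled timesteps
    noise_pred = inferer(inputs=images, diffusion_model=model, noise=noise, timesteps=timesteps)

    # inference: iteratively denoise pure noise into a sample
    samples = inferer.sample(input_noise=noise, diffusion_model=model, scheduler=scheduler)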
docs/source/utils.rst: 5 additions & 0 deletions
@@ -81,3 +81,8 @@ Component store
 ---------------
 .. autoclass:: monai.utils.component_store.ComponentStore
     :members:
+
+Ordering
+--------
+.. automodule:: monai.utils.ordering
+    :members:
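The Ordering utility added in commit 0a549fe maps between a spatial grid and a 1D token sequence, which the ported transformer components consume. A sketch under the assumption that OrderingType lives in monai.utils.enums and that dimensions takes a (batch, H, W)-style shape:

    import torch
    from monai.utils.enums import OrderingType
    from monai.utils.ordering import Ordering

    # raster-scan ordering of a 4x4 grid (assumed argument conventions)
    ordering = Ordering(ordering_type=OrderingType.RASTER_SCAN.value, spatial_dims=2, dimensions=(1, 4, 4))
    seq_idx = ordering.get_sequence_ordering()         # grid -> sequence permutation
    rev_idx = ordering.get_revert_sequence_ordering()  # its inverse

    tokens = torch.arange(16)[seq_idx]
    assert torch.equal(tokens[rev_idx], torch.arange(16))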
monai/apps/detection/utils/anchor_utils.py: 2 additions & 2 deletions
@@ -189,7 +189,7 @@ def generate_anchors(
             w_ratios = 1 / area_scale
             h_ratios = area_scale
         # if 3d, w:h:d = 1:aspect_ratios[:,0]:aspect_ratios[:,1]
-        elif self.spatial_dims == 3:
+        else:
             area_scale = torch.pow(aspect_ratios_t[:, 0] * aspect_ratios_t[:, 1], 1 / 3.0)
             w_ratios = 1 / area_scale
             h_ratios = aspect_ratios_t[:, 0] / area_scale
@@ -199,7 +199,7 @@ def generate_anchors(
         hs = (h_ratios[:, None] * scales_t[None, :]).view(-1)
         if self.spatial_dims == 2:
             base_anchors = torch.stack([-ws, -hs, ws, hs], dim=1) / 2.0
-        elif self.spatial_dims == 3:
+        else:  # elif self.spatial_dims == 3:
             ds = (d_ratios[:, None] * scales_t[None, :]).view(-1)
             base_anchors = torch.stack([-ws, -hs, -ds, ws, hs, ds], dim=1) / 2.0
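A note on the elif/else change above: spatial_dims is validated to be 2 or 3 before these branches, but a static checker cannot see that, so a trailing elif leaves variables like w_ratios and ds "possibly unbound". Converting the final branch to else (keeping the old condition as a comment) makes the chain exhaustive without changing behaviour. The pattern in isolation, with hypothetical names:

    def width_ratio(spatial_dims: int, area_scale: float) -> float:
        # spatial_dims is assumed validated to be 2 or 3 by the caller
        if spatial_dims == 2:
            w = 1.0 / area_scale
        else:  # elif spatial_dims == 3: chain is now exhaustive, so w is always bound
            w = 1.0 / (area_scale ** (1.0 / 3.0))
        return w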
monai/apps/pathology/transforms/post/array.py: 1 addition & 0 deletions
@@ -379,6 +379,7 @@ def _generate_contour_coord(self, current: np.ndarray, previous: np.ndarray) ->
         """

         p_delta = (current[0] - previous[0], current[1] - previous[1])
+        row, col = -1, -1

         if p_delta in ((0.0, 1.0), (0.5, 0.5), (1.0, 0.0)):
             row = int(current[0] + 0.5)
monai/bundle/utils.py: 1 addition & 0 deletions
@@ -221,6 +221,7 @@ def load_bundle_config(bundle_path: str, *config_names: str, **load_kw_args: Any
             raise ValueError(f"Cannot find config file '{full_cname}'")

         ardata = archive.read(full_cname)
+        cdata = {}

         if full_cname.lower().endswith("json"):
             cdata = json.loads(ardata, **load_kw_args)
monai/data/dataset_summary.py: 1 addition & 0 deletions
@@ -84,6 +84,7 @@ def collect_meta_data(self):
         """

         for data in self.data_loader:
+            meta_dict = {}
             if isinstance(data[self.image_key], MetaTensor):
                 meta_dict = data[self.image_key].meta
             elif self.meta_key in data:
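The three one-line additions above (row, col = -1, -1; cdata = {}; meta_dict = {}) all apply one fix: initialise a local to a sentinel value before a conditional chain so no branch can leave it unbound. In isolation, with hypothetical names:

    def parse(fmt: str) -> dict:
        cdata = {}  # sentinel default: without it, an unmatched fmt would make
                    # the return line raise UnboundLocalError
        if fmt == "json":
            cdata = {"parser": "json"}
        elif fmt in ("yml", "yaml"):
            cdata = {"parser": "yaml"}
        return cdata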
monai/data/utils.py: 9 additions & 6 deletions
@@ -53,10 +53,6 @@
     pytorch_after,
 )

-if pytorch_after(1, 13):
-    # import private code for reuse purposes, comment in case things break in the future
-    from torch.utils.data._utils.collate import collate_tensor_fn, default_collate_fn_map
-
 pd, _ = optional_import("pandas")
 DataFrame, _ = optional_import("pandas", name="DataFrame")
 nib, _ = optional_import("nibabel")
@@ -454,8 +450,13 @@ def collate_meta_tensor_fn(batch, *, collate_fn_map=None):
     Collate a sequence of meta tensor into a single batched metatensor. This is called by `collage_meta_tensor`
     and so should not be used as a collate function directly in dataloaders.
     """
-    collate_fn = collate_tensor_fn if pytorch_after(1, 13) else default_collate
-    collated = collate_fn(batch)  # type: ignore
+    if pytorch_after(1, 13):
+        from torch.utils.data._utils.collate import collate_tensor_fn  # imported here for pylint/mypy issues
+
+        collated = collate_tensor_fn(batch)
+    else:
+        collated = default_collate(batch)
+
     meta_dicts = [i.meta or TraceKeys.NONE for i in batch]
     common_ = set.intersection(*[set(d.keys()) for d in meta_dicts if isinstance(d, dict)])
     if common_:
@@ -496,6 +497,8 @@ def list_data_collate(batch: Sequence):
     if pytorch_after(1, 13):
         # needs to go here to avoid circular import
+        from torch.utils.data._utils.collate import default_collate_fn_map
+
         from monai.data.meta_tensor import MetaTensor

         default_collate_fn_map.update({MetaTensor: collate_meta_tensor_fn})
monai/engines/__init__.py: 3 additions & 1 deletion
@@ -12,12 +12,14 @@
 from __future__ import annotations

 from .evaluator import EnsembleEvaluator, Evaluator, SupervisedEvaluator
-from .trainer import GanTrainer, SupervisedTrainer, Trainer
+from .trainer import AdversarialTrainer, GanTrainer, SupervisedTrainer, Trainer
 from .utils import (
+    DiffusionPrepareBatch,
     IterationEvents,
     PrepareBatch,
     PrepareBatchDefault,
     PrepareBatchExtraInput,
+    VPredictionPrepareBatch,
     default_make_latent,
     default_metric_cmp_fn,
     default_prepare_batch,
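The newly exported DiffusionPrepareBatch and VPredictionPrepareBatch generate the per-iteration (noised input, target) pairs that let the stock SupervisedTrainer drive diffusion training. A sketch of the wiring, reusing model, scheduler, and inferer from the DiffusionInferer sketch earlier; num_train_timesteps is an assumed argument name:

    import torch
    from torch.utils.data import DataLoader, TensorDataset
    from monai.engines import SupervisedTrainer
    from monai.engines.utils import DiffusionPrepareBatch

    loader = DataLoader(TensorDataset(torch.rand(8, 1, 32, 32)), batch_size=2)

    trainer = SupervisedTrainer(
        device=torch.device("cpu"),
        max_epochs=1,
        train_data_loader=loader,
        network=model,                    # e.g. the DiffusionModelUNet above
        optimizer=torch.optim.Adam(model.parameters(), 1e-4),
        loss_function=torch.nn.MSELoss(),
        inferer=inferer,                  # e.g. DiffusionInferer(scheduler)
        # samples a timestep and noise per batch; the loss then compares the
        # network's noise prediction against the sampled noise
        prepare_batch=DiffusionPrepareBatch(num_train_timesteps=1000),
    )
    trainer.run()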