4 changes: 4 additions & 0 deletions docs/source/apps.rst
@@ -114,7 +114,11 @@ Clara MMARs
.. automodule:: monai.apps.pathology.transforms.spatial.array
.. autoclass:: SplitOnGrid
:members:
.. autoclass:: TileOnGrid
:members:

.. automodule:: monai.apps.pathology.transforms.spatial.dictionary
.. autoclass:: SplitOnGridd
:members:
.. autoclass:: TileOnGridd
:members:
1 change: 1 addition & 0 deletions docs/source/data.rst
@@ -208,6 +208,7 @@ Decathlon Datalist
.. autofunction:: monai.data.load_decathlon_datalist
.. autofunction:: monai.data.load_decathlon_properties
.. autofunction:: monai.data.check_missing_files
.. autofunction:: monai.data.create_cross_validation_datalist


DataLoader
6 changes: 5 additions & 1 deletion docs/source/engines.rst
@@ -11,7 +11,6 @@ Multi-GPU data parallel
.. automodule:: monai.engines.multi_gpu_supervised_trainer
:members:


Workflows
---------

@@ -56,3 +55,8 @@ Workflows
~~~~~~~~~~~~~~~~~~~
.. autoclass:: EnsembleEvaluator
:members:

Utilities
---------
.. automodule:: monai.engines.utils
:members:
9 changes: 9 additions & 0 deletions docs/source/metrics.rst
@@ -8,6 +8,8 @@ Metrics

`FROC`
------
.. autofunction:: compute_fp_tp_probs
.. autofunction:: compute_froc_curve_data
.. autofunction:: compute_froc_score

`Metric`
@@ -47,13 +49,15 @@ Metrics
`Confusion matrix`
------------------
.. autofunction:: get_confusion_matrix
.. autofunction:: compute_confusion_matrix_metric

.. autoclass:: ConfusionMatrixMetric
:members:

`Hausdorff distance`
--------------------
.. autofunction:: compute_hausdorff_distance
.. autofunction:: compute_percent_hausdorff_distance

.. autoclass:: HausdorffDistanceMetric
:members:
@@ -89,3 +93,8 @@ Metrics
--------------------
.. autoclass:: CumulativeAverage
:members:

Utilities
---------
.. automodule:: monai.metrics.utils
:members:
26 changes: 24 additions & 2 deletions docs/source/networks.rst
@@ -21,7 +21,7 @@ Blocks
:members:

`CRF`
~~~~~~~~~~~~~
~~~~~
.. autoclass:: CRF
:members:

@@ -73,6 +73,8 @@ Blocks
:members:
.. autoclass:: UnetUpBlock
:members:
.. autoclass:: UnetOutBlock
:members:

`SegResnet Block`
~~~~~~~~~~~~~~~~~
@@ -254,6 +256,11 @@ Layers
.. automodule:: monai.networks.layers.Conv
:members:

`Pad`
~~~~~
.. automodule:: monai.networks.layers.Pad
:members:

`Pool`
~~~~~~
.. automodule:: monai.networks.layers.Pool
@@ -300,7 +307,7 @@ Layers
:members:

`PHLFilter`
~~~~~~~~~~~~~~~~~
~~~~~~~~~~~
.. autoclass:: PHLFilter

`GaussianMixtureModel`
@@ -386,6 +393,11 @@ Nets
.. autoclass:: EfficientNet
:members:

`BlockArgs`
~~~~~~~~~~~
.. autoclass:: BlockArgs
:members:

`EfficientNetBN`
~~~~~~~~~~~~~~~~
.. autoclass:: EfficientNetBN
@@ -406,6 +418,11 @@ Nets
.. autoclass:: SegResNetVAE
:members:

`ResNet`
~~~~~~~~
.. autoclass:: ResNet
:members:

`SENet`
~~~~~~~
.. autoclass:: SENet
@@ -513,6 +530,11 @@ Nets
.. autoclass:: FullyConnectedNet
:members:

`VarFullyConnectedNet`
~~~~~~~~~~~~~~~~~~~~~~
.. autoclass:: VarFullyConnectedNet
:members:

`Generator`
~~~~~~~~~~~
.. autoclass:: Generator
20 changes: 20 additions & 0 deletions docs/source/optimizers.rst
@@ -6,6 +6,11 @@ Optimizers
==========
.. currentmodule:: monai.optimizers

`LearningRateFinder`
--------------------
.. autoclass:: LearningRateFinder
:members:

`Novograd`
----------
.. autoclass:: Novograd
@@ -14,3 +19,18 @@ Optimizers
`Generate parameter groups`
---------------------------
.. autofunction:: generate_param_groups

`ExponentialLR`
---------------
.. autoclass:: ExponentialLR
:members:

`LinearLR`
----------
.. autoclass:: LinearLR
:members:

`WarmupCosineSchedule`
----------------------
.. autoclass:: WarmupCosineSchedule
:members:
34 changes: 34 additions & 0 deletions docs/source/transforms.rst
@@ -43,6 +43,11 @@ Generic Interfaces
.. autoclass:: InvertibleTransform
:members:

`TraceableTransform`
^^^^^^^^^^^^^^^^^^^^
.. autoclass:: TraceableTransform
:members:

`BatchInverseTransform`
^^^^^^^^^^^^^^^^^^^^^^^
.. autoclass:: BatchInverseTransform
@@ -64,6 +69,12 @@ Vanilla Transforms
Crop and Pad
^^^^^^^^^^^^

`PadListDataCollate`
""""""""""""""""""""
.. autoclass:: PadListDataCollate
:members:
:special-members: __call__

`Pad`
"""""
.. autoclass:: Pad
@@ -395,6 +406,12 @@ Intensity
:members:
:special-members: __call__

`RandRicianNoise`
"""""""""""""""""
.. autoclass:: RandRicianNoise
:members:
:special-members: __call__

`RandCoarseTransform`
"""""""""""""""""""""
.. autoclass:: RandCoarseTransform
@@ -830,6 +847,12 @@ Utility
:members:
:special-members: __call__

`RemoveRepeatedChannel`
"""""""""""""""""""""""
.. autoclass:: RemoveRepeatedChannel
:members:
:special-members: __call__

`LabelToMask`
"""""""""""""
.. autoclass:: LabelToMask
@@ -1568,6 +1591,12 @@ Utility (Dict)
:members:
:special-members: __call__

`ToPIL`
"""""""
.. autoclass:: ToPIL
:members:
:special-members: __call__

`ToCupyd`
"""""""""
.. autoclass:: ToCupyd
@@ -1710,6 +1739,11 @@ Transform Adaptors
------------------
.. automodule:: monai.transforms.adaptors

`FunctionSignature`
^^^^^^^^^^^^^^^^^^^
.. autoclass:: FunctionSignature
:members:

`adaptor`
^^^^^^^^^
.. autofunction:: monai.transforms.adaptors.adaptor
25 changes: 25 additions & 0 deletions docs/source/utils.rst
@@ -51,3 +51,28 @@ Type conversion
---------------
.. automodule:: monai.utils.type_conversion
:members:

Decorators
----------
.. automodule:: monai.utils.decorators
:members:

Distributed Data Parallel
-------------------------
.. automodule:: monai.utils.dist
:members:

Enums
-----
.. automodule:: monai.utils.enums
:members:

Jupyter Utilities
-----------------
.. automodule:: monai.utils.jupyter_utils
:members:

State Cacher
------------
.. automodule:: monai.utils.state_cacher
:members:
5 changes: 5 additions & 0 deletions docs/source/visualize.rst
@@ -24,3 +24,8 @@ Occlusion sensitivity

.. automodule:: monai.visualize.occlusion_sensitivity
:members:

Utilities
---------
.. automodule:: monai.visualize.utils
:members:
4 changes: 2 additions & 2 deletions monai/engines/utils.py
@@ -166,9 +166,9 @@ class PrepareBatchExtraInput(PrepareBatch):

Args:
extra_keys: if a string or list provided, every item is the key of extra data in current batch,
and will pass the extra data to the network(*args) in order.
and will pass the extra data to the `network(*args)` in order.
If a dictionary is provided, every `{k, v}` pair is the key of extra data in current batch,
`k` the param name in network, `v` is the key of extra data in current batch,
`k` is the param name in network, `v` is the key of extra data in current batch,
and will pass the `{k1: batch[v1], k2: batch[v2], ...}` as kwargs to the network.

"""
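
For reference, a minimal sketch of the `extra_keys` behaviour documented above. The batch keys (`flag`, `weight`) and the keyword names in the dict form are illustrative only, not part of this change:

    import torch
    from monai.engines.utils import PrepareBatchExtraInput

    batch = {
        "image": torch.rand(2, 1, 8, 8),
        "label": torch.randint(0, 2, (2, 1, 8, 8)),
        "flag": torch.tensor([1.0, 0.0]),
        "weight": torch.tensor([0.5, 2.0]),
    }

    # list form: extra data is forwarded positionally, so the trainer would call
    # network(image, batch["flag"], batch["weight"])
    prep_args = PrepareBatchExtraInput(extra_keys=["flag", "weight"])

    # dict form: extra data is forwarded as keyword arguments, so the trainer would call
    # network(image, mask=batch["flag"], scale=batch["weight"])
    prep_kwargs = PrepareBatchExtraInput(extra_keys={"mask": "flag", "scale": "weight"})

    print(prep_args(batch, device=torch.device("cpu")))
    print(prep_kwargs(batch, device=torch.device("cpu")))
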
2 changes: 2 additions & 0 deletions monai/metrics/utils.py
@@ -28,13 +28,15 @@
def ignore_background(y_pred: Union[np.ndarray, torch.Tensor], y: Union[np.ndarray, torch.Tensor]):
"""
This function is used to remove background (the first channel) for `y_pred` and `y`.

Args:
y_pred: predictions. As for classification tasks,
`y_pred` should has the shape [BN] where N is larger than 1. As for segmentation tasks,
the shape should be [BNHW] or [BNHWD].
y: ground truth, the first dim is batch.

"""

y = y[:, 1:] if y.shape[1] > 1 else y
y_pred = y_pred[:, 1:] if y_pred.shape[1] > 1 else y_pred
return y_pred, y
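
A quick illustration of the channel slicing implemented above; the tensor shapes are made up:

    import torch
    from monai.metrics.utils import ignore_background

    y_pred = torch.rand(2, 3, 4, 4)  # [B, N, H, W] predictions including a background channel
    y = torch.rand(2, 3, 4, 4)       # ground truth with the same layout

    y_pred, y = ignore_background(y_pred, y)
    print(y_pred.shape, y.shape)     # background channel dropped: torch.Size([2, 2, 4, 4]) each
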
1 change: 1 addition & 0 deletions monai/networks/nets/resnet.py
@@ -154,6 +154,7 @@ class ResNet(nn.Module):
ResNet based on: `Deep Residual Learning for Image Recognition <https://arxiv.org/pdf/1512.03385.pdf>`_
and `Can Spatiotemporal 3D CNNs Retrace the History of 2D CNNs and ImageNet? <https://arxiv.org/pdf/1711.09577.pdf>`_.
Adapted from `<https://github.com/kenshohara/3D-ResNets-PyTorch/tree/master/models>`_.

Args:
block: which ResNet block to use, either Basic or Bottleneck.
layers: how many layers to use.
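
As a rough usage sketch, assuming the `resnet18` factory defined alongside `ResNet` in `monai/networks/nets/resnet.py` is re-exported from `monai.networks.nets` and forwards keyword arguments such as `spatial_dims`, `n_input_channels` and `num_classes`:

    import torch
    from monai.networks.nets import resnet18  # assumed re-export of the factory in resnet.py

    net = resnet18(spatial_dims=3, n_input_channels=1, num_classes=2)
    out = net(torch.rand(1, 1, 32, 32, 32))  # 3D input: (batch, channel, D, H, W)
    print(out.shape)                          # expected: torch.Size([1, 2])
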
1 change: 1 addition & 0 deletions monai/optimizers/__init__.py
@@ -10,5 +10,6 @@
# limitations under the License.

from .lr_finder import LearningRateFinder
from .lr_scheduler import ExponentialLR, LinearLR, WarmupCosineSchedule
from .novograd import Novograd
from .utils import generate_param_groups
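
The schedulers re-exported above can be stepped like any torch scheduler. A small sketch, assuming the `WarmupCosineSchedule(optimizer, warmup_steps, t_total)` signature of the upstream implementation it is adapted from:

    import torch
    from monai.optimizers import Novograd, WarmupCosineSchedule

    model = torch.nn.Linear(10, 2)
    optimizer = Novograd(model.parameters(), lr=1e-2)
    scheduler = WarmupCosineSchedule(optimizer, warmup_steps=10, t_total=100)

    x = torch.rand(4, 10)
    for _ in range(100):
        loss = model(x).sum()
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()   # update the weights first,
        scheduler.step()   # then advance the learning-rate schedule
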
32 changes: 16 additions & 16 deletions monai/optimizers/lr_finder.py
@@ -146,30 +146,30 @@ class LearningRateFinder:
and what is the optimal learning rate.

Example (fastai approach):
>>> lr_finder = LearningRateFinder(net, optimizer, criterion)
>>> lr_finder.range_test(data_loader, end_lr=100, num_iter=100)
>>> lr_finder.get_steepest_gradient()
>>> lr_finder.plot() # to inspect the loss-learning rate graph
>>> lr_finder = LearningRateFinder(net, optimizer, criterion)
>>> lr_finder.range_test(data_loader, end_lr=100, num_iter=100)
>>> lr_finder.get_steepest_gradient()
>>> lr_finder.plot() # to inspect the loss-learning rate graph

Example (Leslie Smith's approach):
>>> lr_finder = LearningRateFinder(net, optimizer, criterion)
>>> lr_finder.range_test(train_loader, val_loader=val_loader, end_lr=1, num_iter=100, step_mode="linear")
>>> lr_finder = LearningRateFinder(net, optimizer, criterion)
>>> lr_finder.range_test(train_loader, val_loader=val_loader, end_lr=1, num_iter=100, step_mode="linear")

Gradient accumulation is supported; example:
>>> train_data = ... # prepared dataset
>>> desired_bs, real_bs = 32, 4 # batch size
>>> accumulation_steps = desired_bs // real_bs # required steps for accumulation
>>> data_loader = torch.utils.data.DataLoader(train_data, batch_size=real_bs, shuffle=True)
>>> acc_lr_finder = LearningRateFinder(net, optimizer, criterion)
>>> acc_lr_finder.range_test(data_loader, end_lr=10, num_iter=100, accumulation_steps=accumulation_steps)
>>> train_data = ... # prepared dataset
>>> desired_bs, real_bs = 32, 4 # batch size
>>> accumulation_steps = desired_bs // real_bs # required steps for accumulation
>>> data_loader = torch.utils.data.DataLoader(train_data, batch_size=real_bs, shuffle=True)
>>> acc_lr_finder = LearningRateFinder(net, optimizer, criterion)
>>> acc_lr_finder.range_test(data_loader, end_lr=10, num_iter=100, accumulation_steps=accumulation_steps)

By default, image will be extracted from data loader with x["image"] and x[0], depending on whether
batch data is a dictionary or not (and similar behaviour for extracting the label). If your data loader
returns something other than this, pass a callable function to extract it, e.g.:
>>> image_extractor = lambda x: x["input"]
>>> label_extractor = lambda x: x[100]
>>> lr_finder = LearningRateFinder(net, optimizer, criterion)
>>> lr_finder.range_test(train_loader, val_loader, image_extractor, label_extractor)
>>> image_extractor = lambda x: x["input"]
>>> label_extractor = lambda x: x[100]
>>> lr_finder = LearningRateFinder(net, optimizer, criterion)
>>> lr_finder.range_test(train_loader, val_loader, image_extractor, label_extractor)

References:
Modified from: https://github.com/davidtvs/pytorch-lr-finder.
8 changes: 4 additions & 4 deletions monai/transforms/intensity/array.py
@@ -129,10 +129,10 @@ class RandRicianNoise(RandomizableTransform):
"""
Add Rician noise to image.
Rician noise in MRI is the result of performing a magnitude operation on complex
data with Gaussian noise of the same variance in both channels, as described in `Noise in Magnitude
Magnetic Resonance Images <https://doi.org/10.1002/cmr.a.20124>`_. This transform is adapted from
`DIPY<https://github.com/dipy/dipy>`_. See also: `The rician distribution of noisy mri data
<https://doi.org/10.1002/mrm.1910340618>`_.
data with Gaussian noise of the same variance in both channels, as described in
`Noise in Magnitude Magnetic Resonance Images <https://doi.org/10.1002/cmr.a.20124>`_.
This transform is adapted from `DIPY <https://github.com/dipy/dipy>`_.
See also: `The rician distribution of noisy mri data <https://doi.org/10.1002/mrm.1910340618>`_.

Args:
prob: Probability to add Rician noise.
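
The corrected docstring describes Rician noise as the magnitude of a complex signal whose real and imaginary channels carry Gaussian noise of equal variance. A small NumPy sketch of that model; the clean image and `std` value are made up:

    import numpy as np

    rng = np.random.default_rng(0)
    img = np.ones((64, 64), dtype=np.float32)     # toy "clean" magnitude image
    std = 0.1

    real = img + rng.normal(0.0, std, img.shape)  # signal plus Gaussian noise (real channel)
    imag = rng.normal(0.0, std, img.shape)        # Gaussian noise only (imaginary channel)
    rician = np.sqrt(real ** 2 + imag ** 2)       # magnitude operation yields Rician-distributed values

This is the effect the transform applies when triggered, e.g. `RandRicianNoise(prob=1.0, std=0.1)`.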