
differentiable transforms #189

Closed
mhubii opened this issue Mar 18, 2020 · 9 comments
Labels
Feature request WG: Transforms For the transforms working group

Comments

@mhubii
Contributor

mhubii commented Mar 18, 2020

The transforms are not differentiable, are they?

Edit: I was primarily looking at transforms.AffineGrid, which generates the affine transformation from NumPy arrays and therefore does not allow PyTorch's autograd mechanism to track gradients.
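As a minimal illustration (plain PyTorch, not MONAI's transforms.AffineGrid), building the sampling grid with torch.nn.functional keeps the whole resampling step differentiable, so gradients flow back to the affine parameters:

```python
import torch
import torch.nn.functional as F

# 1 x 2 x 3 affine matrix as a tensor with requires_grad, instead of a
# NumPy array, so autograd can track it.
theta = torch.tensor([[[1.0, 0.0, 0.1],
                       [0.0, 1.0, 0.0]]], requires_grad=True)

img = torch.rand(1, 1, 32, 32)  # N x C x H x W input
grid = F.affine_grid(theta, img.shape, align_corners=False)
warped = F.grid_sample(img, grid, align_corners=False)

# Gradients reach the affine parameters through the resampling.
warped.sum().backward()
print(theta.grad is not None)  # True
```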

Much of what you want to implement is already available at https://github.com/kornia/kornia, including the Dice loss and a good number of the transforms, all differentiable as well.

It's not even at an alpha release yet, and reading the docs I had trouble understanding which type of parameter is being passed. Often there is just an explanation of what a parameter stands for rather than what data type it actually is, which makes the functions hard to use.

@tvercaut
Member

This relates to #95, as differentiable spatial transformations are required for most learning-based registration approaches.

@tvercaut
Member

At first look, kornia seems focused/restricted to 2D images.

@wyli
Contributor

wyli commented Mar 20, 2020

This feature is almost there; the numeric logic is in place. Now we need to decide whether these trainable layers should be part of the network layers module or the transforms module.

@mhubii
Contributor Author

mhubii commented Mar 20, 2020

At first look, kornia seems focused/restricted to 2D images.

Fair point, but since it works on RGB it should work similarly in 3D as well. However, it does not support more complex transforms.

@mhubii
Contributor Author

mhubii commented Mar 21, 2020

This feature is almost there; the numeric logic is in place. Now we need to decide whether these trainable layers should be part of the network layers module or the transforms module.

The transforms module is meant to be a pre-processing step in MONAI, right? We have the torch.Dataset, which takes a compose of these transforms, and by design the monai.transforms take a dict. PyTorch shouldn't really care whether we extend torch.nn.Module or invoke calls on monai.transforms, but building a network will probably end up looking like a weird mixture of transforms and layers.
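One way to reconcile the two conventions is to keep the dict interface but subclass nn.Module, so a transform can sit inside a network and carry trainable parameters. A hypothetical sketch (the class name, keys argument, and scaling parameter are illustrative assumptions, not MONAI's API):

```python
import torch
import torch.nn as nn

class TrainableScaled(nn.Module):
    """Hypothetical dict-based transform implemented as an nn.Module."""

    def __init__(self, keys, factor=1.0):
        super().__init__()
        self.keys = keys
        # Registering the factor as a Parameter makes it trainable.
        self.factor = nn.Parameter(torch.tensor(factor))

    def forward(self, data):
        # Operate on a dict, following the monai.transforms convention.
        out = dict(data)
        for key in self.keys:
            out[key] = out[key] * self.factor
        return out

sample = {"image": torch.rand(1, 16, 16), "label": torch.zeros(1, 16, 16)}
t = TrainableScaled(keys=["image"], factor=2.0)
result = t(sample)

# The scaling factor receives gradients like any other layer parameter.
result["image"].sum().backward()
print(t.factor.grad is not None)  # True
```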

@tvercaut
Member

For the record, it's also worth having a look at how PhoenixDL/rising did it, with a common split into functionals and transforms:
https://github.com/PhoenixDL/rising/tree/master/rising/transforms
https://github.com/PhoenixDL/rising/tree/master/rising/transforms/functional

Note that quite a few 3D spatial transforms are already available there.

@wyli
Contributor

wyli commented Apr 29, 2020

Yes, @mibaumgartner from PhoenixDL/rising is also part of our working group.

@mibaumgartner
Collaborator

@justusschock
Small side note: rising's functional interface assumes batched tensors and applies the transforms inside the DataLoader. This was necessary to ensure maximum performance on CPU and GPU :) Thanks to this design decision, the transforms can also be added as "normal" network layers. [I think a more detailed discussion of this will also be part of the next working group meeting]

So there might be some changes that need to be made to the transforms (this is a general consideration when subclassing nn.Module, since modules generally also operate on batches).

Also: operating on RGB images is not equivalent to operating on volumetric data, which would be 4D including the channel dimension.
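The batched convention described above can be sketched as follows (an illustrative assumption in plain PyTorch, not rising's actual API): a functional that operates on N x C x D x H x W volumes works unchanged whether it is called on a DataLoader batch or used inside a network:

```python
import torch
import torch.nn.functional as F

def affine_3d(batch, theta):
    """Apply per-sample affine transforms to a batch of 3D volumes.

    batch: N x C x D x H x W volumes
    theta: N x 3 x 4 affine matrices
    """
    grid = F.affine_grid(theta, batch.shape, align_corners=False)
    return F.grid_sample(batch, grid, align_corners=False)

vol = torch.rand(2, 1, 8, 16, 16)                     # batched 3D volumes
theta = torch.eye(3, 4).unsqueeze(0).repeat(2, 1, 1)  # identity transforms
out = affine_3d(vol, theta)
print(out.shape)  # torch.Size([2, 1, 8, 16, 16])
```

Because the functional already consumes a batch dimension, wrapping it in an nn.Module for use as a layer requires no reshaping.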

@wyli wyli added this to To-Do in v0.2.0 via automation May 15, 2020
@wyli wyli removed this from To-Do in v0.2.0 Jun 18, 2020
@wyli wyli added Feature request WG: Transforms For the transforms working group labels May 13, 2021
@vikashg

vikashg commented Jan 4, 2024

closed

@vikashg vikashg closed this as completed Jan 4, 2024