
Support for convolution on N-dimensions #213

Closed
shivance opened this issue Jan 30, 2023 · 4 comments

Comments

@shivance
Contributor

shivance commented Jan 30, 2023

Motivation and description

  1. @ToucheSir's comment on "Adding UNet Model" #210
  2. FastAI.jl's support for an ndim parameter in its conv layers

See:
https://github.com/FluxML/FastAI.jl/blob/505621985c27f0d988086345eb44ce7074611173/FastVision/src/models/xresnet.jl#L4

Possible Implementation

Add N-dimensional kernel support to Metalhead's Flux layers (e.g. conv_norm, basic_conv_bn, ...).

Reason: Metalhead's UNet adaptation from FastAI.jl uses Metalhead's prebuilt layers like those mentioned above, plus residual blocks such as basicblock, and these do not offer an option to specify the number of input dimensions. They were developed with only 2D images in mind, hence the 2D kernels.

An ndim option would enable kernels for multi-dimensional images. Note the Dims{2} restriction in conv_norm's signature:

function conv_norm(kernel_size::Dims{2}, inplanes::Integer, outplanes::Integer,
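A minimal sketch of what relaxing that restriction could look like. This is hypothetical (conv_norm_nd is not Metalhead's actual API); it only illustrates replacing Dims{2} with Dims{N}:

```julia
using Flux

# Hypothetical sketch: a conv_norm-style builder accepting kernels of any
# rank N, instead of the Dims{2} restriction. Not Metalhead's real code.
function conv_norm_nd(kernel_size::Dims{N}, inplanes::Integer, outplanes::Integer;
                      pad = ntuple(_ -> 1, N)) where {N}
    # Dimensionality follows from length(kernel_size); Conv handles the rest.
    Chain(Conv(kernel_size, inplanes => outplanes; pad),
          BatchNorm(outplanes, relu))
end

m = conv_norm_nd((3, 3, 3), 1, 8)      # a 3D conv + norm block
x = rand(Float32, 8, 8, 8, 1, 2)       # W, H, D, channels, batch
size(m(x))                             # (8, 8, 8, 8, 2) with pad = 1
```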

@darsnack
Member

No need for a separate keyword. We can just remove the 2 from the type; Flux's Conv already understands kernel-size tuples of lengths other than 2. FastAI.jl's code generates the tuple itself, which is why it accepts a keyword.

Most of the work here is removing the type restrictions and adding tests to make sure everything works with dimensions other than 2D.
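Flux's Conv does indeed infer the dimensionality from the kernel-size tuple, so no keyword is needed. A quick illustration (assuming only Flux itself):

```julia
using Flux

# The same constructor handles 2D and 3D; the kernel tuple decides the rank.
conv2d = Conv((3, 3), 3 => 16, relu)       # expects WHCN input
conv3d = Conv((3, 3, 3), 3 => 16, relu)    # expects WHDCN input

x2 = rand(Float32, 32, 32, 3, 1)
x3 = rand(Float32, 16, 16, 16, 3, 1)
size(conv2d(x2))   # (30, 30, 16, 1) with no padding
size(conv3d(x3))   # (14, 14, 14, 16, 1)
```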

@shivance
Contributor Author

Hi @darsnack,
I removed the 2 from the type and tested Metalhead against the following unit test:

@testset "UNet" begin
    encoder = Metalhead.backbone(ResNet(18))
    model = UNet((256, 256, 256), 3, 6, encoder)
    x = zeros(256, 256, 256, 1)
    @test size(model(x)) == (256, 256, 256, 6, 1)
end

It failed with this error:

UNet: Error During Test at /home/anshuman/Desktop/Metalhead.jl/test/convnets.jl:1
Got exception outside of a @test
DimensionMismatch: Rank of x and w must match! (5 vs. 4)
Stacktrace:

@darsnack
Member

darsnack commented Feb 15, 2023

Did you also update the corresponding UNet code to generate kernel-size tuples of length N? It looks like the functions that build the UNet blocks are hardcoded to (3, 3).
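The suggestion above amounts to deriving the kernel tuple from the input dimensionality instead of writing (3, 3) literally. A small sketch (kernel_for is a hypothetical helper, not a Metalhead function):

```julia
# Hypothetical helper: build a kernel-size tuple of rank `ndims`
# instead of hardcoding (3, 3) in the block constructors.
kernel_for(ndims::Int, k::Int = 3) = ntuple(_ -> k, ndims)

kernel_for(2)   # (3, 3)
kernel_for(3)   # (3, 3, 3)
```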

@shivance
Contributor Author

shivance commented Feb 15, 2023

Yup, I tried that as well :)
But I still get the same error. I even know the reason: we use the backbone of Metalhead models as the encoder, and if you take a look, all of them have hardcoded 2D kernels (e.g. (1, 1)), hence the error.

@shivance closed this as not planned (won't fix, can't repro, duplicate, stale) on Jun 4, 2023