
[Feature Request] Groups added to Conv2d #100

Open
Aavache opened this issue Dec 9, 2023 · 11 comments
Labels
enhancement New feature or request

Comments

@Aavache

Aavache commented Dec 9, 2023

πŸš€ What is the purpose of this issue?

According to the documentation, these are the parameters available for Conv2d.

  • in_channels (int) – The number of input channels.
  • out_channels (int) – The number of output channels.
  • kernel_size (int or tuple) – The size of the convolution filters.
  • stride (int or tuple, optional) – The size of the stride when applying the filter. Default: 1.
  • padding (int or tuple, optional) – How many positions to 0-pad the input with. Default: 0.
  • bias (bool, optional) – If True, add a learnable bias to the output. Default: True.
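The spatial output size implied by these parameters follows the standard convolution arithmetic (the formula below is the one given in the torch Conv2d docs; `conv_output_size` is a hypothetical helper for illustration, not part of MLX):

```python
def conv_output_size(size, kernel, stride=1, padding=0, dilation=1):
    """Spatial output size of a convolution, per the standard formula:
    out = floor((size + 2*padding - dilation*(kernel-1) - 1) / stride) + 1
    """
    return (size + 2 * padding - dilation * (kernel - 1) - 1) // stride + 1

# A 3x3 filter with padding=1 preserves the spatial size:
print(conv_output_size(32, 3, stride=1, padding=1))  # -> 32
# Dilation enlarges the effective kernel extent to d*(k-1)+1, shrinking the output:
print(conv_output_size(32, 3, dilation=2))  # -> 28
```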

However, the dilation and groups parameters, as defined in the torch implementation of Conv2d, would be desirable for implementing well-known architectures, e.g. ASPP (atrous spatial pyramid pooling). The following are the descriptions of these two parameters according to torch:

  • dilation controls the spacing between the kernel points; also known as the à trous algorithm. It is harder to describe, but this link has a nice visualization of what dilation does.

  • groups controls the connections between inputs and outputs. in_channels and out_channels must both be divisible by groups. For example, at groups=1, all inputs are convolved to all outputs. At groups=2, the operation becomes equivalent to having two conv layers side by side, each seeing half the input channels and producing half the output channels, with the two outputs subsequently concatenated.
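To make these semantics concrete, here is a minimal NumPy sketch (a naive reference implementation for illustration only; `conv2d` is a hypothetical helper, not the MLX or torch kernel) that checks the equivalence described above: a groups=2 convolution matches two independent convolutions over the channel halves, concatenated.

```python
import numpy as np

def conv2d(x, w, stride=1, dilation=1, groups=1):
    """Naive 2D convolution. x: (C_in, H, W); w: (C_out, C_in//groups, kH, kW)."""
    C_in, H, W = x.shape
    C_out, C_in_g, kH, kW = w.shape
    assert C_in % groups == 0 and C_out % groups == 0
    assert C_in_g == C_in // groups
    # Effective kernel extent under dilation: d*(k-1) + 1.
    eH, eW = dilation * (kH - 1) + 1, dilation * (kW - 1) + 1
    oH, oW = (H - eH) // stride + 1, (W - eW) // stride + 1
    out = np.zeros((C_out, oH, oW))
    out_per_g = C_out // groups
    for g in range(groups):
        # Each group sees only its own slice of the input channels.
        xs = x[g * C_in_g:(g + 1) * C_in_g]
        for oc in range(out_per_g):
            wk = w[g * out_per_g + oc]
            for i in range(oH):
                for j in range(oW):
                    # Strided slicing implements dilation: sample every
                    # `dilation`-th position inside the effective window.
                    patch = xs[:, i * stride:i * stride + eH:dilation,
                                  j * stride:j * stride + eW:dilation]
                    out[g * out_per_g + oc, i, j] = np.sum(patch * wk)
    return out

rng = np.random.default_rng(0)
x = rng.standard_normal((4, 8, 8))
w = rng.standard_normal((6, 2, 3, 3))  # groups=2: each filter sees 4//2=2 channels
y = conv2d(x, w, groups=2)
# groups=2 is equivalent to two side-by-side convolutions, concatenated:
y_split = np.concatenate([conv2d(x[:2], w[:3]), conv2d(x[2:], w[3:])], axis=0)
assert np.allclose(y, y_split)
```

The same sketch also shows the dilation effect on shape: with an 8x8 input, a 3x3 kernel, and dilation=2, the effective window is 5x5, so the output is 4x4.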

Thank you for your attention πŸ‘

@Aavache Aavache changed the title [Feature Request] Dilation and Groups Arguments could be added to Conv2d [Feature Request] Dilation and Groups arguments could be added to Conv2d Dec 9, 2023
@awni awni added the enhancement New feature or request label Dec 9, 2023
@dc-dc-dc
Contributor

@jagrit06 in the conv2d op call, there is an assert checking that dilation is set to 1, but from a quick look through the implementation it seems to support dilation. I removed this check, experimented a bit while comparing against PyTorch, and am seeing the same results for different dilation values. Is there a specific case where dilation isn't supported?

@jagrit06
Member

Well, at the moment convolutions with dilation revert to a fairly naive and slow kernel on the GPU (though I think we do fine on the CPU).

That restriction was added basically to sidestep that performance hit.

@dc-dc-dc
Contributor

dc-dc-dc commented Feb 1, 2024

Should the assert be removed, with a separate issue created to track improving convolution speed on the GPU when dilation is not 1? That would give the core issue more visibility and open up the possibility of a contributor helping out.

@a1eaiactaest
Contributor

Agree with @dc-dc-dc; this would be really helpful for implementations that require dilation.

@jimexist

jimexist commented Feb 3, 2024

FWIW, this PR now also depends on a groups param being present.

@a1eaiactaest
Contributor

Added dilation in #766

@awni
Member

awni commented Mar 19, 2024

Is anyone working on groups for convolution? E.g. @Stealeristaken ?

I am closing #637 as it's inactive but it might be useful as a starting point.

@awni awni changed the title [Feature Request] Dilation and Groups arguments could be added to Conv2d [Feature Request] Groups added to Conv2d Mar 19, 2024
@Stealeristaken
Contributor

> Is anyone working on groups for convolution? E.g. @Stealeristaken ?
>
> I am closing #637 as it's inactive but it might be useful as a starting point.

Been a bit busy these days. I tried adding groups to convs, but it didn't work very well, so I closed that pull request. I'll try again in the next few days.

@Rifur13
Contributor

Rifur13 commented Mar 28, 2024

I'm eager to get this working as well. I can pick this up if no one is working on it.

@awni
Member

awni commented Mar 28, 2024

I don't think anyone is actively working on it, would be great to have you work on it @Rifur13 !

@Stealeristaken
Contributor

> I'm eager to get this working as well. I can pick this up if no one is working on it.

Feel free to take. gl!

8 participants