convolution layer with cardinality #23
Dear author,
Thank you for this impressive piece of work.
How can a convolution layer with cardinality be implemented? In universal.py we have:
If the convolution layer has a cardinality greater than one (as in several modern models), we get `assert False`.
How can cardinality be implemented for convolution layers?
Comments
Removing a single filter from a group convolution in PyTorch will cause misalignment, but it is possible to remove an entire group by using the Group Pruning method proposed in our paper.
Thanks a lot for your quick reply. I understand that using the Group Pruning method allows us to synchronize the number of channels pruned within a group in order to avoid misalignment. However, I am still confused about how to make sure that the final number of input channels after Group Pruning is divisible by the number of groups in the convolution. Should I modify `g.minimal_filter`, e.g. `g.minimal_filter = int(g.minimal_filter // conv_groups)`?
In fact, I am a bit confused about how to remove an entire group.
If you remove a whole group of filters at a time, the number can be kept divisible. The troublesome part is maintaining the number of channels of the input feature map, since group convolution also splits the input. For example, given 6 filters divided into 3 groups, removing the first group leaves 4 filters, and the cardinality should change to 2. The input may still have 6 channels, so the two corresponding feature maps need to be discarded. This is why the code to prune a network with cardinality may be very different from what we provided.
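A minimal PyTorch sketch of this idea on standalone layers (the `remove_group` helper and the surrounding names are illustrative, not part of this repository's code):

```python
import torch.nn as nn

def remove_group(conv, group_idx):
    """Remove one whole group from a grouped convolution.

    Following the example above: 6 filters in 3 groups; removing the first
    group leaves 4 filters and the cardinality becomes 2. Returns the new
    conv plus the kept input/output channel indices, so the neighbouring
    layers know which feature maps to discard.
    """
    assert conv.groups > 1 and 0 <= group_idx < conv.groups
    out_per_group = conv.out_channels // conv.groups
    in_per_group = conv.in_channels // conv.groups

    # Drop a whole group at once so the remaining filter count stays
    # divisible by the new number of groups.
    keep_out = [i for i in range(conv.out_channels) if i // out_per_group != group_idx]
    keep_in = [i for i in range(conv.in_channels) if i // in_per_group != group_idx]

    new_conv = nn.Conv2d(
        in_channels=len(keep_in),
        out_channels=len(keep_out),
        kernel_size=conv.kernel_size,
        stride=conv.stride,
        padding=conv.padding,
        dilation=conv.dilation,
        groups=conv.groups - 1,          # cardinality shrinks by one
        bias=conv.bias is not None,
    )
    # Grouped-conv weights have shape (out_channels, in_channels // groups, kH, kW),
    # so only the output dimension needs slicing; the per-group input width is unchanged.
    new_conv.weight.data = conv.weight.data[keep_out].clone()
    if conv.bias is not None:
        new_conv.bias.data = conv.bias.data[keep_out].clone()
    return new_conv, keep_in, keep_out


# The 6-filter / 3-group example from the comment above:
conv = nn.Conv2d(6, 6, kernel_size=3, groups=3, padding=1)
next_conv = nn.Conv2d(6, 8, kernel_size=3, padding=1)

conv, keep_in, keep_out = remove_group(conv, group_idx=0)
print(conv)  # Conv2d(4, 4, kernel_size=(3, 3), ..., groups=2)

# The preceding layer should keep only the output channels in keep_in, and the
# following (non-grouped) layer drops the input channels of the removed group:
pruned_next = nn.Conv2d(len(keep_out), next_conv.out_channels, kernel_size=3, padding=1)
pruned_next.weight.data = next_conv.weight.data[:, keep_out].clone()
if next_conv.bias is not None:
    pruned_next.bias.data = next_conv.bias.data.clone()
```

This only covers the per-layer bookkeeping; in a full network the same index lists would also have to be propagated through batch normalization and any residual connections, which is why the cardinality case differs from the code provided in the repository.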