
[Feature] Add group_fisher pruning algorithm to prune a detection model #410

Open · wants to merge 6 commits into base: main

Conversation

@wilxy (Contributor) commented Dec 27, 2022

Motivation

Add the pruning algorithm from Group Fisher Pruning for Practical Network Compression as a demo for pruning detection models (RetinaNet in this PR).

Modification

  1. Add GroupFisher algorithm.
  2. Add its related configs and README.
  3. Add its related unittest.

@pppppM (Collaborator) left a comment


@wilxy @LKJacky During the development of GroupFisher, some of MMRazor's problems were exposed.

Problems

  1. L1MutableChannelUnit inherits from SequentialMutableChannelUnit, but L1MutableChannelUnit is not sequential.
  2. Are Mutator and ChannelUnit necessary? I can implement GroupFisher with just GroupFisherAlgorithm and L1MutableChannelUnit: https://github.com/pppppM/mmrazor/tree/group_fisher
  3. For pruning algorithms, the boundaries between Algorithm, ChannelMutator, and ChannelUnit are not clear enough. Even though I am the developer of MMRazor, I can't find a reasonable way to divide GroupFisher among Algorithm, ChannelMutator, and ChannelUnit.

Suggestions

I think the above problems are difficult to solve completely without refactoring, especially problems 2 and 3. Here are some suggestions for the refactoring.

  • For the boundary between Mutator and ChannelUnit, I suggest we refer to NNI and concentrate the pruning logic in the Mutator rather than splitting it between Mutator and ChannelUnit; ChannelUnit would serve only as the most basic data structure.
    This way, when developers implement new pruning algorithms, they will know clearly that they need to develop a new XXChannelMutator based on ChannelUnit.
  • For the boundary between Mutator and Algorithm, the Mutator should contain all the basic APIs of a pruning algorithm, and the Algorithm should implement the complete pruning algorithm through those APIs.
    As much of the algorithm logic as possible should live in the Mutator; the Algorithm is just adapter logic for OpenMMLab, so that MMRazor can be quickly used in other codebases.
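The suggested division of responsibilities can be sketched in a few lines. This is only an illustration of the proposed boundaries; every class and method name below (ChannelUnit, ChannelMutator, PruneAlgorithm, and their methods) is hypothetical, not MMRazor's actual API.

```python
class ChannelUnit:
    """Plain data structure: a group of channels that are pruned together."""
    def __init__(self, name, num_channels):
        self.name = name
        self.num_channels = num_channels


class ChannelMutator:
    """Holds all the pruning logic and exposes the basic pruning APIs."""
    def __init__(self, units):
        self.units = units

    def importance(self, unit):
        # A concrete mutator (e.g. a GroupFisherMutator) would score units
        # here, for instance with accumulated Fisher information.
        raise NotImplementedError

    def prune_one_channel(self):
        # Remove one channel from the least important unit.
        unit = min(self.units, key=self.importance)
        unit.num_channels -= 1


class PruneAlgorithm:
    """Thin OpenMMLab adapter: decides only *when* to call the Mutator."""
    def __init__(self, mutator, interval):
        self.mutator = mutator
        self.interval = interval

    def after_train_iter(self, it):
        if it % self.interval == 0:
            self.mutator.prune_one_channel()
```

Under this split, a new pruning algorithm is written by subclassing the Mutator (overriding `importance` and, if needed, `prune_one_channel`), while the Algorithm stays a small scheduling wrapper.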

Temporary Solution

Considering the reality that MMRazor needs some robust pruning algorithms as soon as possible, and that the above suggestions cannot be realized in the short term, I suggest releasing GroupFisher in the form of a project (https://github.com/open-mmlab/mmocr/tree/1.x/projects). After the above problems are completely solved, it can be merged into MMRazor.

@LKJacky (Collaborator) commented Jan 10, 2023

@pppppM Here are three answers to the problems raised above.

  1. It's indeed a problem. I am considering refactoring L1MutableChannelUnit and SequentialMutableChannelUnit.
  2. We can implement a pruning algorithm without Mutator and MutableChannelUnit. However, separating them makes the code more readable: the Mutator controls the pruning structure, while the MutableChannelUnit defines the core pruning logic.
  3. I think the module structure is reasonable. We may need more documentation to explain the division:

| Module | Responsibility |
| --- | --- |
| Algorithm | When to prune |
| Mutator | Which layer to prune |
| Unit | How to compute channel importance |
| DynamicOp | Collect information about the model, such as feature maps and gradients |

We have now refactored GroupFisher by dividing it into these four parts in the project folder:

| Module | Responsibility |
| --- | --- |
| Algorithm | Prune iteratively, on a fixed schedule of training iterations |
| Mutator | Prune the channel with the minimum Fisher info each time |
| Unit | Compute the Fisher info for a unit |
| DynamicOp | Collect the input of each layer and the gradient of that input |
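To make the Unit's role concrete, here is a minimal NumPy sketch of the per-channel Fisher importance used by Group Fisher pruning, assuming the DynamicOp has already collected a layer's input feature map and the gradient of the loss with respect to that input. The function name and shapes are illustrative, not this PR's actual code.

```python
import numpy as np

def channel_fisher_importance(activations, grads):
    """Approximate per-channel Fisher importance.

    activations, grads: arrays of shape (N, C, H, W) -- a layer's input
    feature map and the gradient of the loss w.r.t. that input.

    For each channel c:
        importance[c] = sum_n ( sum_{h,w} a[n,c,h,w] * g[n,c,h,w] )^2
    """
    per_sample = (activations * grads).sum(axis=(2, 3))  # shape (N, C)
    return (per_sample ** 2).sum(axis=0)                 # shape (C,)

# Toy usage: 2 samples, 3 channels, 4x4 feature maps.
rng = np.random.default_rng(0)
a = rng.normal(size=(2, 3, 4, 4))
g = rng.normal(size=(2, 3, 4, 4))
imp = channel_fisher_importance(a, g)
# The Mutator would then prune the channel with the smallest importance:
least_important = int(imp.argmin())
```

The importance is a sum of squares, so it is always non-negative, and channels whose activations contribute little to the loss gradient score near zero.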
