FLOPs and parameters question about SlimPruner #1846
Comments
@tanglang96 Thanks for your reply! Could you please release the code that constructs the new compact model?
@chaos0625 We are working on it and will release the code soon.
@tanglang96 @QuanluZhang Thanks for your work. I hope you will release the complete code, since the current release is incomplete.
Adding @Cjkkkk to track this.
Hello, have you solved the problem? If so, could you please share your code? Thanks!
@marsggbo Not yet.
@marsggbo @chaos0625 We will provide an alpha version of this feature in v1.4.
@chaos1992 @marsggbo Model speedup has been supported since v1.4; please refer to this doc and this example. Note that it is still an alpha version.
NNI Environment:
Q1:
After running slim_pruner_torch_vgg19.py (I set the epoch count to 5 for quick testing; the other settings are the defaults), I compared the sizes of vgg19_cifar10.pth and pruned_vgg19_cifar10.pth. They are both 78,321 KB, and their inference times are nearly identical. This doesn't seem right: pruned_vgg19_cifar10.pth should be smaller and faster than vgg19_cifar10.pth.
How can I use slim_pruner correctly?
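A likely explanation (consistent with the maintainers' later reply about model speedup): mask-based pruners zero out weights in place rather than physically removing channels, so every tensor keeps its original shape and the checkpoint keeps its original size until a separate speedup step rebuilds the compact model. A minimal sketch of this effect, using a hypothetical toy model in place of VGG19:

```python
import torch
import torch.nn as nn

# Hypothetical toy model standing in for VGG19; the layer sizes are illustrative.
model = nn.Sequential(nn.Linear(8, 4), nn.ReLU(), nn.Linear(4, 2))

with torch.no_grad():
    for p in model.parameters():
        p.fill_(1.0)  # deterministic weights for the demo
    # Simulate what a mask-based pruner does: zero some weights in place.
    # Tensor shapes, and therefore the saved checkpoint size, do not change.
    model[0].weight[:2].zero_()

total = sum(p.numel() for p in model.parameters())
nonzero = sum((p != 0).sum().item() for p in model.parameters())
print(total, nonzero)  # 46 30 -- same parameter count, fewer nonzero weights
```

Saving this model with torch.save would produce a file the same size as before pruning, even though a quarter of its weights are now zero.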
Q2:
So I tried torchstat.stat(model, (3, 32, 32)) in order to get the FLOPs and parameter counts, but it raises an error: AttributeError: Can't get attribute 'vgg' on <module '__main__' from ...>
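That AttributeError typically means the checkpoint was saved with torch.save(model, path), which pickles a reference to the model's class; loading it from a different script then fails because the class (here 'vgg') is not importable at the same module path. A common workaround, sketched with a hypothetical Net class in place of vgg, is to save only the state_dict:

```python
import torch
import torch.nn as nn

# Hypothetical stand-in for the vgg class from the training script.
class Net(nn.Module):
    def __init__(self):
        super().__init__()
        self.fc = nn.Linear(4, 2)

model = Net()

# torch.save(model, path) pickles the class reference itself, so a later
# torch.load needs that class importable at the same module path; loading
# from another script is what raises "Can't get attribute 'vgg'".
# Saving only the state_dict avoids that dependency:
torch.save(model.state_dict(), "net.pth")

restored = Net()                                  # rebuild the architecture in code
restored.load_state_dict(torch.load("net.pth"))   # then load only the weights
```

With the state_dict approach, any script that can construct the model class can load the weights, and tools that inspect the live module object work as expected.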
According to docs/en_US/Compressor/SlimPruner.md, after SlimPruner the model's parameters are reduced from 20.04M to 2.03M.
How can I get the model's parameters and FLOPs?
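For reference, a figure like the 20.04M parameter count from the docs can be reproduced without torchstat. A minimal sketch (the helper names and the example conv layer are illustrative, not from the thread):

```python
import torch.nn as nn

def count_params_millions(model: nn.Module) -> float:
    """Total learnable parameters, in millions."""
    return sum(p.numel() for p in model.parameters()) / 1e6

def conv2d_macs(layer: nn.Conv2d, out_h: int, out_w: int) -> int:
    """Multiply-accumulate operations for one forward pass of a conv layer (batch 1)."""
    kernel_ops = (layer.in_channels // layer.groups) * layer.kernel_size[0] * layer.kernel_size[1]
    return layer.out_channels * out_h * out_w * kernel_ops

# Example: the first conv layer of a VGG-style network on 32x32 CIFAR-10 input.
conv = nn.Conv2d(3, 64, kernel_size=3, padding=1)
print(count_params_millions(conv))  # (3*3*3*64 + 64) / 1e6 = 0.001792
print(conv2d_macs(conv, 32, 32))    # 64 * 32 * 32 * 27 = 1769472 MACs
```

Summing count_params_millions over the whole network gives the parameter total; summing conv2d_macs (plus the analogous count for linear layers) gives an approximate FLOPs figure, keeping in mind that some tools report MACs and others report 2x MACs as FLOPs.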
Looking forward to your reply, thanks!