
When you call python train.py --deploy --eval <-other-commands-> the network is pruned at runtime (by the if statement on line 73) and then used. #3

Closed
cvJie opened this issue Oct 15, 2019 · 0 comments

cvJie commented Oct 15, 2019

When you call python train.py --deploy --eval <-other-commands->, the network is pruned at runtime (by the if statement on line 73) and then used.

If you would like to save the pruned model, you could put a:

save_checkpoint({
    'epoch': SD['epoch'],                  # SD is the checkpoint dict loaded earlier in train.py
    'state_dict': model.state_dict(),      # weights of the now-pruned model
    'error_history': SD['error_history'],
}, filename=filename)

after the call to pruner.compress(model) on line 89.
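
(For reference, save_checkpoint in most PyTorch training scripts is just a thin wrapper around torch.save. A minimal sketch of such a helper is below; the actual helper in train.py may differ, so treat the name and default filename as assumptions.)

import torch

def save_checkpoint(state, filename='pruned_checkpoint.t7'):
    # Persist the checkpoint dict (epoch, state_dict, error_history) to disk.
    torch.save(state, filename)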

However, it is better to leave the saved checkpoint in its original state, with the masks attached, so that the pruning trajectories are easier to recover. If you want an indication of the compression rate, you can check the number of ops/params with get_inf_params(model). Whenever you want to use a pruned model for something, just run the code from lines 71 to 89 in train.py, followed by the task you want the pruned model to complete (see the sketch below).

Hope this helps.

Originally posted by @jack-willturner in #1 (comment)
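
For concreteness, the reuse workflow described in the quoted comment amounts to something like the sketch below. Only pruner.compress(model) and get_inf_params(model) are named in the comment; the checkpoint path, model constructor, pruner setup, and evaluation call are placeholders standing in for the actual code on lines 71 to 89 of train.py.

import torch

# Hypothetical sketch: restore the original checkpoint (masks still attached),
# re-apply the pruner at runtime, then hand the pruned model to your task.
SD = torch.load('checkpoints/model.t7', map_location='cpu')  # assumed checkpoint path

model = build_model()                    # placeholder for the architecture built in train.py
model.load_state_dict(SD['state_dict'])  # weights with the pruning masks attached

pruner.compress(model)                   # prune at runtime, as on line 89 of train.py
print(get_inf_params(model))             # rough indication of the compression rate

evaluate(model)                          # placeholder for whatever task the pruned model should run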

cvJie closed this as completed Oct 15, 2019