
How to get the pruned model without pickle and module dependencies? #6

Open
planemanner opened this issue Aug 23, 2021 · 0 comments
planemanner commented Aug 23, 2021

First of all, thank you for sharing your work.

Your experiment code works perfectly and I was able to reproduce your experiments.

However, I run into trouble when I try to obtain the pruned model (before fine-tuning) for use in my own custom experiments.

From your main script and the rest of the code, it looks like loading a pruned model in another code base requires importing every module the network depends on before the pruned weights can be unpickled.

Is there any detour that would let me load a pruned model as easily as the following?

import torch

# Desired: load the pruned model directly, without importing the
# original model-definition modules from this repository.
model_path = "Pruned_model.pth"
model = torch.load(model_path)
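
For reference, the only detour I can think of is sketched below. It is not verified against this repository, the file names are placeholders, and it assumes the pruned model is an ordinary, scriptable nn.Module: export the pruned model with TorchScript once inside the original code base, so that it can later be reloaded without any of the original class definitions.

import torch

# In the original code base, where all model-definition modules are importable,
# un-pickle the pruned model as usual (hypothetical checkpoint name).
model = torch.load("Pruned_model.pth")
model.eval()

# Export to TorchScript. The saved file carries both the graph and the weights,
# so loading it does not require the Python classes that defined the model.
scripted = torch.jit.script(model)  # or torch.jit.trace(model, example_input)
scripted.save("pruned_model_scripted.pt")

# In any other code base, only torch itself is needed:
model = torch.jit.load("pruned_model_scripted.pt")

Would something like this work with your pruned models, or is there a recommended way?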