Issue about default HAWQ #13

Closed
wowow11111 opened this issue Apr 15, 2021 · 2 comments

wowow11111 commented Apr 15, 2021

Hi, I've been working on running HAWQ on my machine, and I can now run the 'test_resnet_inference_time.py' file completely.
Next, I'm trying to take a model from your model zoo and run it on GPU following the instructions in your repo. (Ultimately, I want to run HAWQ on a VTA/TVM-based NPU.)

I re-downloaded the baseline, followed the steps you gave, and am facing a few questions.
First, except for 'resnet18_uniform8', the models downloadable from the model zoo contain only a 'checkpoint.pth.tar' file, not a 'quantized_checkpoint.pth.tar' file, which leads to a [No such file or directory] error.
Also, 'hawq_utils_resnet50.py' is hard-coded for ResNet-50.

So, what is the difference between checkpoint and quantized_checkpoint?
Is it okay to simply change 'quantized_checkpoint' to 'checkpoint' in the 'hawq_utils_resnet50.py' file?

If I do, the earlier error (the dict_key error) occurs. How do I convert the parameters, as in step "3. change PyTorch parameters to TVM format", for the models that only contain a checkpoint.pth.tar file?

@wowow11111 (Author)

Apparently, the quantized_checkpoint file, which only 'resnet18_uniform8' provides, has the expected dict keys, while the other models, which ship only a checkpoint file, produce errors. And 'hawq_utils_resnet50.py' is still hard-coded for ResNet-50. What is needed to run other models, such as ResNet-50 uniform4 or the mixed-precision models?

@wowow11111 (Author)

OK, I worked around it by pulling the needed data out of the checkpoint.pth.tar file and manually creating a quantized_checkpoint.pth.tar file:
I loaded the checkpoint file and built a new dictionary containing the keys that 'quantized_checkpoint' expects.
I think I can close this issue.
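For anyone hitting the same problem, here is a minimal sketch of the workaround described above. The wrapper key 'state_dict' and the exact entry names are assumptions; inspect your own .pth.tar files to confirm which keys the quantized checkpoint actually expects.

```python
# Sketch: build a 'quantized_checkpoint'-style dict from a plain
# checkpoint by copying over only the keys the quantized format expects.
# Key names here ('state_dict') are assumptions -- verify against the
# real files shipped with the HAWQ model zoo.

def extract_quantized_state(checkpoint, wanted_keys):
    """Return a new dict containing only wanted_keys from the
    checkpoint's state dict, raising KeyError if any are missing."""
    # Some checkpoints wrap the weights in a 'state_dict' entry;
    # fall back to the top level if that wrapper is absent.
    state = checkpoint.get("state_dict", checkpoint)
    missing = [k for k in wanted_keys if k not in state]
    if missing:
        raise KeyError("checkpoint is missing keys: %s" % missing)
    return {k: state[k] for k in wanted_keys}

# In practice you would load and save with PyTorch, e.g.:
#   ckpt = torch.load("checkpoint.pth.tar", map_location="cpu")
#   quantized = {"state_dict": extract_quantized_state(ckpt, wanted)}
#   torch.save(quantized, "quantized_checkpoint.pth.tar")
```

The wanted_keys list would be taken from a working quantized_checkpoint (e.g. the one shipped with 'resnet18_uniform8'), so the new file matches the dict keys that 'hawq_utils_resnet50.py' looks up.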
