
How to Get a pretrained model to do PTQ? #9

Closed · youdutaidi opened this issue May 18, 2022 · 5 comments

youdutaidi commented May 18, 2022

Thank you for sharing your PTQ code, but when I test DeiT-Small with the min-max quantization technique, the top-1 accuracy on the validation dataset is only 0.11%, which does not match the paper's 75.05%. Is there anything wrong?

linyang-zhh (Collaborator) commented May 18, 2022

Hi! Our models are loaded from the official open-source checkpoints, which can be found at link1 and link2. You can check whether the pretrained models are loaded correctly.
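
For reference, a quick way to confirm that a checkpoint actually loads into the model is to inspect the keys reported by `load_state_dict`. This is a minimal sketch; the `models` import and the checkpoint filename below are placeholders, not this repo's exact API:

```python
import torch
from models import deit_small  # placeholder import; use this repo's actual model factory

model = deit_small(pretrained=False)
ckpt = torch.load("deit_small_patch16_224.pth", map_location="cpu")
# official DeiT checkpoints nest the weights under a "model" key
state_dict = ckpt.get("model", ckpt)
missing, unexpected = model.load_state_dict(state_dict, strict=False)
print("missing keys:", missing)        # should be empty, or only quantizer buffers
print("unexpected keys:", unexpected)  # should be empty
```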

youdutaidi (Author) commented May 24, 2022

> Hi! Our models are loaded from the official open-source checkpoints, which can be found at link1 and link2. You can check whether the pretrained models are loaded correctly.

Yes, I loaded the pretrained model from the link and tested it, but the validation result is poor. I printed the pred and target labels (the original code is):

```python
def accuracy(output, target, topk=(1,)):
    maxk = max(topk)
    batch_size = target.size(0)
    # top-k predicted class indices, shape (maxk, batch_size) after transpose
    _, pred = output.topk(maxk, 1, True, True)
    pred = pred.t()
    print("pred: ", pred)
    print("target: ", target.reshape(1, -1).expand_as(pred))
    correct = pred.eq(target.reshape(1, -1).expand_as(pred))
    res = []
    for k in topk:
        correct_k = correct[:k].reshape(-1).float().sum(0)
        res.append(correct_k.mul_(100.0 / batch_size))
    return res
```

And the printed results are like this (truncated here for readability; each target row is 50 images of class 14 followed by 50 of class 15, while the predictions are scattered across unrelated classes):

```
pred:   tensor([[171, 819, 672,  30, 688, 549, 549, 591, 591, 145, ...],
                [622, 150, 591, 688, 907, 941, 745, 152, 740, 557, ...],
                [196, 336,  30, 591, 928, 798, 657, 549, 672, 745, ...],
                [492, 978, 874, 513, 740, 591, 492, 798, 337, 492, ...],
                [978, 387, 978, 874, 591, 745, 692, 740,  30, 571, ...]], device='cuda:0')
target: tensor([[14, 14, 14,  ..., 15, 15, 15],
                [14, 14, 14,  ..., 15, 15, 15],
                [14, 14, 14,  ..., 15, 15, 15],
                [14, 14, 14,  ..., 15, 15, 15],
                [14, 14, 14,  ..., 15, 15, 15]], device='cuda:0')
```
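
(As a side note, a toy run of the accuracy function above makes this failure mode easy to see: scattered predictions against constant 14/15 targets give roughly 0% top-1. The tensors below are synthetic, not the repo's evaluation loop.)

```python
import torch

logits = torch.randn(100, 1000)  # one fake batch of 100 ImageNet logit vectors
target = torch.cat([torch.full((50,), 14, dtype=torch.long),
                    torch.full((50,), 15, dtype=torch.long)])
top1, top5 = accuracy(logits, target, topk=(1, 5))
print(top1, top5)  # ~0% for random logits; a correctly loaded FP32 DeiT-S should be ~79-80% top-1
```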

linyang-zhh (Collaborator) commented

@youdutaidi Hi, sorry about that. Can you provide your command and the code (if the code was modified)? I will check it.

youdutaidi (Author) commented May 25, 2022

> @youdutaidi Hi, sorry about that. Can you provide your command and the code (if the code was modified)? I will check it.

Thank you very much for your kind help.
I didn't modify the original code, and the command is:

```bash
CUDA_VISIBLE_DEVICES=1 python test_quant.py deit_small /home/dataset/cailingling/ImageNet/ --quant --ptf --lis --quant-method minmax
```

The result is: [screenshot of the evaluation output]

PeiqinSun (Member) commented

Please first verify that your dataset and float model are correct:

  1. You can use the command:

```bash
CUDA_VISIBLE_DEVICES=1 python test_quant.py deit_small /home/dataset/cailingling/ImageNet/
```
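
If it helps, an equivalent standalone FP32 sanity check would look like the sketch below. It uses timm's pretrained DeiT-S and a standard torchvision ImageNet loader; the dataset path, preprocessing, and early-exit threshold are assumptions, not test_quant.py's exact pipeline:

```python
import torch
import timm
from torchvision import datasets, transforms

# Hypothetical standalone FP32 sanity check, separate from test_quant.py.
model = timm.create_model("deit_small_patch16_224", pretrained=True).eval().cuda()

tf = transforms.Compose([
    transforms.Resize(256),
    transforms.CenterCrop(224),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
])
val = datasets.ImageFolder("/home/dataset/cailingling/ImageNet/val", transform=tf)
loader = torch.utils.data.DataLoader(val, batch_size=100, num_workers=4)

correct = total = 0
with torch.no_grad():
    for images, target in loader:
        pred = model(images.cuda()).argmax(dim=1).cpu()
        correct += (pred == target).sum().item()
        total += target.numel()
        if total >= 5000:  # a few thousand images is enough for a sanity check
            break
print(f"top-1: {100.0 * correct / total:.2f}%")  # FP32 DeiT-S should be ~79-80%
```

If this number is already near zero, the problem is in the dataset layout or the checkpoint, not in the quantization.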
