Reshape Quantization not supported? #119

Closed
rattokiller opened this issue Sep 20, 2022 · 1 comment

Comments

@rattokiller

rattokiller commented Sep 20, 2022

Hi @davidbriand-cea , @cmoineau,

According to the documentation, Reshape is supported, but when I run:

sudo n2d2 model.ini -seed 1 -w /dev/null -export CPP -nbbits 8 -db-export 1000 -export-parameters param.ini -calib -1

I get:

  - model_1/conv2d_13/BiasAdd:0: prev=5476.95, act=66.2797, bias=0.0023597
      quant=127, global scaling=28088.2 -> cell scaling=0.00153536
  - model_1/concatenate_5/concat:0: prev=28088.2, act=54.2452, bias=0.00901091
      quant=1, global scaling=6019.95 -> cell scaling=4.66585
Time elapsed: 24.2798 s
Error: Quantization of cell 'model_1/flatten_1/Reshape:0' of type 'Reshape' is not supported yet.

Since there are no weights to quantize, would it be possible to have quantization simply ignore this layer?

Cheers,
Filippo

@cmoineau
Collaborator

cmoineau commented Oct 7, 2022

Hi @rattokiller,
I think the Reshape layer is marked as supported for QAT but not for PTQ. If you can link to the point in the documentation that states otherwise, I will try to fix it.

I added the Reshape cell to the ignore list for the PTQ in the latest commit; this should fix your issue!
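For reference, the idea behind the fix is simple: cells with no learnable parameters (like Reshape) pass activations through unchanged, so the PTQ pass can skip them and let the previous cell's scaling propagate. Here is a minimal, illustrative sketch of that pattern in Python (this is NOT N2D2's actual code; the cell types, names, and structure are hypothetical):

```python
# Illustrative sketch of a PTQ ignore list for parameter-free cells.
# Cell types listed here have no weights to quantize (hypothetical set).
PTQ_IGNORE_TYPES = {"Reshape", "Flatten", "Transpose"}

def quantize_network(cells):
    """Quantize each cell's weights; skip parameter-free cell types.

    `cells` is a list of (name, cell_type, weights) tuples.
    Returns a per-cell report of the action taken.
    """
    report = []
    for name, cell_type, weights in cells:
        if cell_type in PTQ_IGNORE_TYPES:
            # No weights: the previous cell's scaling propagates as-is.
            report.append((name, "ignored"))
            continue
        # Placeholder for real weight quantization (e.g. scale to int8).
        _quantized = [round(w * 127) for w in weights]
        report.append((name, "quantized"))
    return report

cells = [
    ("conv1", "Conv", [0.5, -0.25]),
    ("flatten_1/Reshape", "Reshape", []),
    ("fc1", "Fc", [0.1]),
]
print(quantize_network(cells))
```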

Cheers,
Cyril

@cmoineau cmoineau closed this as completed Nov 2, 2022