Hi @davidbriand-cea, @cmoineau,

The documentation suggests that the Reshape layer is compatible, but when I run:

sudo n2d2 model.ini -seed 1 -w /dev/null -export CPP -nbbits 8 -db-export 1000 -export-parameters param.ini -calib -1

I get the following output:
- model_1/conv2d_13/BiasAdd:0: prev=5476.95, act=66.2797, bias=0.0023597
quant=127, global scaling=28088.2 -> cell scaling=0.00153536
- model_1/concatenate_5/concat:0: prev=28088.2, act=54.2452, bias=0.00901091
quant=1, global scaling=6019.95 -> cell scaling=4.66585
Time elapsed: 24.2798 s
Error: Quantization of cell 'model_1/flatten_1/Reshape:0' of type 'Reshape' is not supported yet.
Since there are no weights to quantize, would it be possible to make this a quantization-ignore layer?
Cheers,
Filippo
Hi @rattokiller,
I think the Reshape layer was marked as supported for QAT but not for PTQ. If you can link the part of the documentation that states otherwise, I will try to change it.
I added the Reshape cell to the ignore list for the PTQ in the latest commit; this should fix your issue!
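For context, skipping a Reshape during post-training quantization is safe because a reshape has no weights and does not change any values: the quantized integers and their calibration scale can simply be forwarded. A minimal sketch of that pass-through idea (illustrative names only, not N2D2's actual internals; the scale value is taken from the calibration log above):

```python
def reshape_quantized(flat_values, scale, new_shape):
    """Pass-through 'reshape' for a quantized tensor.

    A reshape changes only the shape metadata; the int8 values and the
    per-tensor scale are forwarded unchanged, so there is nothing to
    calibrate or requantize.  (Hypothetical sketch, not N2D2 code.)
    """
    n = 1
    for d in new_shape:
        n *= d
    assert n == len(flat_values), "reshape must preserve the element count"
    return flat_values, scale, new_shape

# int8 values from some upstream layer, with its calibration scale
vals = [-128, 0, 64, 127, -1, 32]
out_vals, out_scale, out_shape = reshape_quantized(vals, 0.00153536, (3, 2))
```

Here the output tensor reuses the input's values and scale verbatim, which is exactly what an "ignore" entry in the PTQ pass amounts to.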