Trying to quantise MobileNetV3 Small: Exception encountered when calling layer "tf.__operators__.add_137" #980
Comments
Same problem with MobileNetV3Large: `AttributeError: Exception encountered when calling layer "tf.__operators__.add_29" (type TFOpLambda). 'list' object has no attribute 'dtype'. Call arguments received by layer "tf.__operators__.add_29" (type TFOpLambda).`
Ok, now I am setting `minimalistic` to True and getting this error:
Ok, this seems to work:
Additionally, I have noticed that this also seems to work for MobileNetV3Small, even though the article above only mentions MobileNetV3Large.
Ok now I run this code:
and get this error:
for both int8 and uint8. I have run it without full int8 quantisation and it works, but I need full int8!
Hi, the model input shape is (224, 224, 3), so you can try size=(1, 224, 224, 3) in the representative_dataset function. Thanks.
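Putting that suggestion together, here is a minimal end-to-end sketch of full-int8 conversion with a representative dataset of the suggested (1, 224, 224, 3) shape. It uses a tiny stand-in Keras model rather than MobileNetV3, and random arrays in place of real calibration images; both are assumptions for the sake of a self-contained example.

```python
import numpy as np
import tensorflow as tf

# Tiny stand-in model with the same input shape as MobileNetV3.
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(224, 224, 3)),
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(10),
])

def representative_dataset():
    # Each yielded sample needs a leading batch dimension: (1, 224, 224, 3).
    for _ in range(10):
        yield [np.random.rand(1, 224, 224, 3).astype(np.float32)]

converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]
converter.representative_dataset = representative_dataset
# Force full integer quantization of weights and activations.
converter.target_spec.supported_ops = [tf.lite.OpsSet.TFLITE_BUILTINS_INT8]
converter.inference_input_type = tf.int8   # or tf.uint8
converter.inference_output_type = tf.int8  # or tf.uint8
tflite_model = converter.convert()

interpreter = tf.lite.Interpreter(model_content=tflite_model)
print(interpreter.get_input_details()[0]["dtype"])
```

If the input dtype reported by the interpreter is not int8/uint8, the converter silently fell back to float I/O, which usually means the representative dataset shape did not match the model input.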
There has recently been an article saying MobileNetV3 can be used for QAT:
https://blog.tensorflow.org/2022/06/Adding-Quantization-aware-Training-and-Pruning-to-the-TensorFlow-Model-Garden.html
However when I run this code:
I get this error:
I am using an M1 Mac:
- conda
- Python 3.10.4
- tensorflow-macos 2.9.2
- tensorflow-metal 0.5.0
- tensorflow-model-optimization 0.7.2