Can I implement a model that learns weights that fit the quantization step size? #50320
Labels: comp:lite (TF Lite related issues), stale (to be closed automatically if no activity), stat:awaiting response (awaiting response from author), type:support (support issues)
I want to build a model whose quantization is trainable, i.e. the weights are learned together with the quantization. Is there a way to do this?
I already know how to extract the weights of a trained model,
quantize them with an external tool, and then load them back.
Thanks for reading this question, and have a nice day.
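One common approach to this is quantization-aware training with a learnable step size (as in methods like LSQ): during training, weights pass through a "fake quantization" op that snaps them to the integer grid defined by a step-size parameter, and that step size can itself be trained. Below is a minimal NumPy sketch of just the fake-quantization forward pass, not TF Lite's actual implementation; the function and parameter names are illustrative, and in a real model the rounding step would need a straight-through estimator so gradients can flow:

```python
import numpy as np

def fake_quantize(w, step, num_bits=8):
    # Snap weights onto the integer grid defined by the (learnable) step
    # size, then map back to floats. In a real QAT setup, round() would be
    # paired with a straight-through estimator so `step` receives gradients.
    qmin = -(2 ** (num_bits - 1))          # e.g. -128 for 8 bits
    qmax = 2 ** (num_bits - 1) - 1         # e.g.  127 for 8 bits
    q = np.clip(np.round(w / step), qmin, qmax)
    return q * step

weights = np.array([0.07, -0.32, 1.5])
# Values come back snapped to multiples of the step size (here 0.1).
print(fake_quantize(weights, step=0.1))
```

For TensorFlow specifically, the TensorFlow Model Optimization toolkit (`tensorflow_model_optimization`) provides quantization-aware training that wraps Keras layers with fake-quant ops, so the model learns weights that are robust to the quantization applied at TF Lite conversion time.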