Google Coral support #1
It may be worth trying the whisper-tiny.en.tflite model on the Google Coral Edge TPU.
I think the challenge, apart from the size, is still that the converted models are not in a format the compiler accepts, because they contain dynamic-size tensors.
If I am not missing anything, the whisper-tiny.en.tflite model does not have dynamic-size tensors.
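For what it's worth, a quick way to check whether a given .tflite file declares dynamic-size inputs or outputs is to inspect the shape signatures with the TFLite interpreter. A minimal sketch (the model path is just a placeholder, and this only covers the graph's inputs and outputs, not intermediate tensors):

import tensorflow as tf

# Load the model in the TFLite interpreter without running it.
interpreter = tf.lite.Interpreter(model_path="whisper-tiny.en.tflite")

# A -1 in shape_signature marks a dimension whose size is only known at runtime,
# i.e. a dynamic-size tensor that the Edge TPU compiler cannot handle.
for detail in interpreter.get_input_details() + interpreter.get_output_details():
    dynamic = -1 in detail["shape_signature"]
    print(detail["name"], detail["shape_signature"], "dynamic" if dynamic else "static")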
Oh, good to know that they differ that way. Thanks! For reference, here's the compiler output for $ edgetpu_compiler whisper-int8.tflite:
Edge TPU Compiler version 16.0.384591198
Started a compilation timeout timer of 180 seconds.
ERROR: Attempting to use a delegate that only supports static-sized tensors with a graph that has dynamic-sized tensors.
Compilation failed: Model failed in Tflite interpreter. Please ensure model can be loaded/run in Tflite interpreter.
Compilation child process completed within timeout period.
Compilation failed!

Unfortunately, $ edgetpu_compiler whisper-tiny.en.tflite fails too:
Edge TPU Compiler version 16.0.384591198
ERROR: Op builtin_code out of range: 150. Are you using old TFLite binary with newer model?
ERROR: Registration failed.
Invalid model: whisper-tiny.en.tflite
Model could not be parsed

(I'm using this Colab to quickly try the compiler.)
Maybe some of the ops are not supported by edgetpu_compiler. You could raise an issue on the edgetpu_compiler side.
@nyadla-sys Hi, I tested all of your tflite models. They work well on Coral and i.MX 8M Plus hardware under the CPU delegate, but there is a problem under the NPU/TPU/GPU delegate. Do you have any comment? I suspect the problem lies in the input/output type.
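In case it helps narrow down the delegate-side failures, this is roughly how a compiled model would be loaded with the Edge TPU delegate for comparison against the CPU path. A minimal sketch following the Coral Python docs; the _edgetpu.tflite filename is a placeholder for whatever the compiler would emit:

import tflite_runtime.interpreter as tflite

# The Edge TPU runtime is exposed to TFLite as an external delegate (libedgetpu).
interpreter = tflite.Interpreter(
    model_path="whisper-tiny.en_edgetpu.tflite",  # placeholder: output of edgetpu_compiler
    experimental_delegates=[tflite.load_delegate("libedgetpu.so.1")],
)
interpreter.allocate_tensors()

# Any op the compiler could not map stays on the CPU; only the mapped subgraph
# runs on the TPU, so a model that "works" here may still be mostly CPU-bound.
print(interpreter.get_input_details())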
The pre-trained Whisper models don't work out of the box with the Google Coral Edge TPU. They first need to be converted to TensorFlow Lite, fully quantized to 8-bit, and compiled with edgetpu_compiler, which in turn requires static tensor sizes and only Edge-TPU-supported ops.
The usefulsensors/openai-whisper repo achieves part of that (e.g. whisper-int8.tflite), but does not yet produce compilable models (e.g. dynamic tensors need to be converted to static).
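To make the quantization part concrete, a full-integer TFLite conversion on the TensorFlow side looks roughly like this. This is only a sketch, assuming a SavedModel export of the Whisper encoder with a fixed input shape and a small calibration set; the paths and the random representative data are placeholders:

import numpy as np
import tensorflow as tf

def representative_dataset():
    # Placeholder calibration data: a few log-mel spectrograms with the
    # fixed shape the Whisper encoder expects (80 mel bins x 3000 frames).
    for _ in range(100):
        yield [np.random.rand(1, 80, 3000).astype(np.float32)]

converter = tf.lite.TFLiteConverter.from_saved_model("whisper_encoder_savedmodel")  # placeholder path
converter.optimizations = [tf.lite.Optimize.DEFAULT]
converter.representative_dataset = representative_dataset
# Force full int8 quantization, including the graph inputs/outputs,
# which is what the Edge TPU compiler requires.
converter.target_spec.supported_ops = [tf.lite.OpsSet.TFLITE_BUILTINS_INT8]
converter.inference_input_type = tf.int8
converter.inference_output_type = tf.int8

with open("whisper-encoder-int8.tflite", "wb") as f:
    f.write(converter.convert())

Even with this, the resulting model still has to pass edgetpu_compiler's static-shape and supported-op checks before any of it runs on the TPU.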