TFLite GPU crashes when not all ops are supported by delegate? #25950
Labels: comp:lite (TF Lite related issues), stat:awaiting response (Status - Awaiting response from author), type:support (Support issues)
System information
Describe the current behavior
Using TF Lite with the GPU developer preview, running inference on my retrained MobileNetV2 model (with two outputs) works well on most devices. However, on a Galaxy J5 (an older device), it crashes with:
Describe the expected behavior
I would expect the isGpuDelegateAvailable() method to return true only when it can provide a delegate capable of preparing and running all the ops, and false otherwise. Is there some way to tell this in advance, or should I manually check for these kinds of internal errors and fall back by recreating the Interpreter without GPU support?
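Until the delegate can report its capability up front, a manual try/catch fallback seems to be the workaround. The following is a minimal, self-contained sketch of that pattern only (the supplier names and the generic helper are hypothetical, not TF Lite API): in a real app the two suppliers would construct an Interpreter with and without the GpuDelegate added to Interpreter.Options, and the exact exception type thrown by the Java bindings when delegate preparation fails may differ, so the catch clause is deliberately broad.

```java
import java.util.function.Supplier;

public class DelegateFallback {
    // Hypothetical helper: try the GPU-delegated path first; if building or
    // preparing it throws (as happens when the delegate cannot handle all
    // ops on an older device), rebuild on the CPU-only path instead.
    public static <T> T createWithFallback(Supplier<T> withGpu, Supplier<T> cpuOnly) {
        try {
            return withGpu.get();
        } catch (RuntimeException e) {
            // In a real app: log e, close the GpuDelegate if it was created,
            // then construct a plain Interpreter without the delegate.
            return cpuOnly.get();
        }
    }

    public static void main(String[] args) {
        // Simulate the Galaxy J5 case: the GPU path fails internally.
        String result = createWithFallback(
            () -> { throw new IllegalStateException("delegate Prepare failed"); },
            () -> "cpu-interpreter");
        System.out.println(result); // prints "cpu-interpreter"
    }
}
```

The same helper covers the happy path: when the GPU supplier succeeds, its interpreter is returned and the CPU supplier is never invoked.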
Code to reproduce the issue
```java
protected void runInference() {
```
Other info / logs
Thank you guys for your effort to make this available to us, you are doing a very good job!