Regular TensorFlow ops are not supported by this interpreter. Make sure you apply/link the Flex delegate before inference. #63

Closed
farazk86 opened this issue Jan 5, 2021 · 8 comments

farazk86 commented Jan 5, 2021

Hi,

I'm getting the following error when initializing the interpreter:

I/tflite  ( 6266): Initialized TensorFlow Lite runtime.
E/tflite  ( 6266): Regular TensorFlow ops are not supported by this interpreter. Make sure you apply/link the Flex delegate before inference.
E/tflite  ( 6266): Node number 0 (FlexPlaceholder) failed to prepare.
E/flutter ( 6266): [ERROR:flutter/lib/ui/ui_dart_state.cc(177)] Unhandled Exception: Bad state: failed precondition
E/flutter ( 6266): #0      checkState (package:quiver/check.dart:73:5)
E/flutter ( 6266): #1      Interpreter.allocateTensors (package:tflite_flutter/src/interpreter.dart:150:5)
E/flutter ( 6266): #2      new Interpreter._ (package:tflite_flutter/src/interpreter.dart:31:5)
E/flutter ( 6266): #3      new Interpreter._create (package:tflite_flutter/src/interpreter.dart:42:24)
E/flutter ( 6266): #4      new Interpreter.fromBuffer (package:tflite_flutter/src/interpreter.dart:91:37)
E/flutter ( 6266): #5      Interpreter.fromAsset (package:tflite_flutter/src/interpreter.dart:114:24)
E/flutter ( 6266): <asynchronous suspension>
E/flutter ( 6266): #6      _MyHomePageState.loadModel (package:text_gen_gpu/main.dart:321:20)
E/flutter ( 6266): <asynchronous suspension>
E/flutter ( 6266): #7      _MyHomePageState.init (package:text_gen_gpu/main.dart:299:5)
E/flutter ( 6266): <asynchronous suspension>
E/flutter ( 6266): #8      _MyHomePageState.initState.<anonymous closure> (package:text_gen_gpu/main.dart)
E/flutter ( 6266): <asynchronous suspension>
E/flutter ( 6266): 

I'm initializing on the CPU:

var interpreterOptions = InterpreterOptions()..threads = NUM_LITE_THREADS;
_interpreter = await Interpreter.fromAsset(
  modelFile,
  options: interpreterOptions,
);
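For what it's worth, here is a quick way to check which ops in the converted model require Flex (a sketch, assuming the third-party tflite flatbuffer-parser package from pip; Flex ops show up as custom ops whose names start with "Flex"):

# pip install tflite  (flatbuffer bindings for the TFLite schema)
import tflite

with open('model.tflite', 'rb') as f:
    buf = f.read()

model = tflite.Model.GetRootAsModel(buf, 0)
for i in range(model.OperatorCodesLength()):
    op_code = model.OperatorCodes(i)
    # Flex (select TF) ops are stored as custom ops named "Flex..."
    if op_code.BuiltinCode() == tflite.BuiltinOperator.CUSTOM:
        print('custom op:', op_code.CustomCode())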
mgalgs (Contributor) commented Jan 8, 2021

How did you create your model? Have you gone through the docs for Running TF2 Detection API Models on mobile?

farazk86 (Author) commented Jan 8, 2021

> How did you create your model? Have you gone through the docs for Running TF2 Detection API Models on mobile?

Hi,

Yes, I've created the tflite model as per the documentation.

import tensorflow as tf

TF_PATH = "/content/tf_model.pb"  # where the frozen graph is stored
TFLITE_PATH = "./model.tflite"

# Make a converter object from the frozen TensorFlow graph (.pb) file
converter = tf.compat.v1.lite.TFLiteConverter.from_frozen_graph(
    TF_PATH,
    input_arrays=['input_ids'],
    output_arrays=['logits'],
)

converter.experimental_new_converter = True

converter.target_spec.supported_ops = [
    tf.compat.v1.lite.OpsSet.TFLITE_BUILTINS,
    tf.compat.v1.lite.OpsSet.SELECT_TF_OPS,
]

tf_lite_model = converter.convert()

# Save the model.
with open(TFLITE_PATH, 'wb') as f:
    f.write(tf_lite_model)

I'm converting an ONNX model to tflite (onnx -> tensorflow -> tflite). The ONNX model works as expected with the onnxruntime interpreter.
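For reference, the onnx -> tensorflow step looks roughly like this (a sketch using the onnx-tf package; the input path is a placeholder, and note that newer onnx-tf versions export a SavedModel directory rather than a frozen .pb):

import onnx
from onnx_tf.backend import prepare

# Load the ONNX model and export it as a TensorFlow graph
onnx_model = onnx.load('/content/model.onnx')  # placeholder path
tf_rep = prepare(onnx_model)
tf_rep.export_graph('/content/tf_model.pb')  # the frozen graph fed to the converter above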

The converted tflite model also works when using the tflite interpreter in Python:

tflite_interpreter = tf.lite.Interpreter(model_path='/content/model.tflite')
tflite_interpreter.allocate_tensors()

input_details = tflite_interpreter.get_input_details()
output_details = tflite_interpreter.get_output_details()
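Running a full inference in Python also succeeds (a minimal check; the dummy input below just uses whatever shape and dtype the model reports):

import numpy as np

# Feed a dummy batch of token ids and run the graph end to end
dummy_input = np.zeros(input_details[0]['shape'], dtype=input_details[0]['dtype'])
tflite_interpreter.set_tensor(input_details[0]['index'], dummy_input)
tflite_interpreter.invoke()
logits = tflite_interpreter.get_tensor(output_details[0]['index'])
print(logits.shape)

(The Python interpreter that ships with the full tensorflow package links the Flex delegate automatically, which is presumably why the model runs there but not against the plugin's stock TFLite binaries.)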

I only get this error when using the tflite_flutter interpreter. :(

mgalgs (Contributor) commented Jan 8, 2021

Did you run it through export_tflite_graph_tf2.py as well? I'm pretty sure this was the exact error message I was seeing before I started using that guy. I was doing the same thing as you, just using the tflite Python API to convert my model to tflite, but it needs to go through export_tflite_graph_tf2.py before you do that. Check out my earlier issue for more details:

#59 (comment)

farazk86 (Author) commented Jan 9, 2021

> Did you run it through export_tflite_graph_tf2.py as well? [...] Check out my earlier issue for more details: #59 (comment)

Thanks for the link, but from export_tflite_graph_tf2.py:

> NOTE: This only supports SSD meta-architectures for now.

Since my model is text generation, not object detection, I won't be able to use the linked exporter. :(

am15h (Owner) commented Jan 14, 2021

Closing the issue as it does not seem to be directly related to this plugin.

am15h closed this as completed Jan 14, 2021
anovis commented Apr 15, 2021


@farazk86 did you end up finding a solution around this? I am running into the same error

farazk86 (Author) commented

> @farazk86 did you end up finding a solution around this? I am running into the same error

No, but I ended up handling all TensorFlow Lite operations in Java using a Flutter platform channel. Using the Java interpreter worked for me. But your error may come from using an incorrect converter. Make sure you know which TF version your model was trained on: after tf==2.1 or tf==2.2 (I don't remember exactly right now), they changed how models are converted to tflite.
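For what it's worth, the newer conversion path starts from a SavedModel rather than a frozen graph (a sketch, assuming you have a SavedModel export of the network; the paths are placeholders):

import tensorflow as tf

# TF2-style conversion: from_saved_model replaces the v1 from_frozen_graph path
converter = tf.lite.TFLiteConverter.from_saved_model('/content/saved_model')
converter.target_spec.supported_ops = [
    tf.lite.OpsSet.TFLITE_BUILTINS,  # regular TFLite kernels
    tf.lite.OpsSet.SELECT_TF_OPS,    # fall back to TF (Flex) kernels where needed
]
tflite_model = converter.convert()

with open('model.tflite', 'wb') as f:
    f.write(tflite_model)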

anovis commented Apr 16, 2021

Thanks! I will take a look at that. I am using the prebuilt universal-sentence-encoder-multilingual model, and I can't find what version they used to build it, other than that it was TensorFlow 2.0.
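In case it helps, converting that hub model would look roughly like this (a sketch, not a tested recipe: hub.resolve downloads the module and returns its local SavedModel path, and you may also need tensorflow_text installed so the model's custom ops are registered; whether the result then runs without Flex is exactly the open question here):

import tensorflow as tf
import tensorflow_hub as hub
import tensorflow_text  # noqa: F401 -- registers the custom text ops the encoder uses

# Resolve the hub module to a local SavedModel directory
model_dir = hub.resolve('https://tfhub.dev/google/universal-sentence-encoder-multilingual/3')

converter = tf.lite.TFLiteConverter.from_saved_model(model_dir)
converter.target_spec.supported_ops = [
    tf.lite.OpsSet.TFLITE_BUILTINS,
    tf.lite.OpsSet.SELECT_TF_OPS,  # the encoder needs ops outside the builtin set
]
tflite_model = converter.convert()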
