Regular TensorFlow ops are not supported by this interpreter. Make sure you apply/link the Flex delegate before inference. #63
How did you create your model? Have you gone through the docs for Running TF2 Detection API Models on mobile?
Hi, yes, I've created the tflite model as per the documentation:

```python
import tensorflow as tf

TF_PATH = "/content/tf_model.pb"  # where the frozen graph is stored
TFLITE_PATH = "./model.tflite"

# Make a converter object from the saved TensorFlow file
converter = tf.compat.v1.lite.TFLiteConverter.from_frozen_graph(
    TF_PATH,  # TensorFlow frozen-graph .pb model file
    input_arrays=['input_ids'],
    output_arrays=['logits'],
)
converter.experimental_new_converter = True
converter.target_spec.supported_ops = [tf.compat.v1.lite.OpsSet.TFLITE_BUILTINS,
                                       tf.compat.v1.lite.OpsSet.SELECT_TF_OPS]
tf_lite_model = converter.convert()

# Save the model.
with open(TFLITE_PATH, 'wb') as f:
    f.write(tf_lite_model)
```

I'm converting an ONNX model to tflite. Also, the converted tflite model works when using the tflite interpreter in Python:

```python
tflite_interpreter = tf.lite.Interpreter(model_path='/content/model.tflite')
tflite_interpreter.allocate_tensors()
input_details = tflite_interpreter.get_input_details()
output_details = tflite_interpreter.get_output_details()
```

I only get this error when using the …
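The Python interpreter can run select TF ops because the full TensorFlow pip package bundles them, but on mobile the Flex delegate has to be linked into the app. A minimal sketch for an Android Gradle build, assuming the standard TensorFlow Lite artifact names (the version number here is only an example placeholder):

```gradle
dependencies {
    // Core TFLite runtime
    implementation 'org.tensorflow:tensorflow-lite:2.8.0'           // version is an example
    // Flex delegate, so models converted with SELECT_TF_OPS can run on device
    implementation 'org.tensorflow:tensorflow-lite-select-tf-ops:2.8.0'
}
```

Note that the select-tf-ops AAR is large, so it is usually worth checking whether the model can be converted with builtins only before shipping it.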
Did you run it through …
Thanks for the link, but from the …
Since my model is text generation, not object detection, I won't be able to use the exporter linked. :(
Closing the issue as it does not seem to be directly related to this plugin.
@farazk86 did you end up finding a solution around this? I am running into the same error.
No, but I ended up handling all TensorFlow Lite operations in Java using the Flutter platform channel. Using the Java interpreter worked for me. But your error may be from using an incorrect converter. Make sure you know what TF version your main model was trained on. After …
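For models trained with TF 2.x, the TF2 converter entry points are usually the right ones rather than the `tf.compat.v1` frozen-graph path. A minimal sketch of a TF2-style conversion with select TF ops enabled; the tiny Keras model here is only a stand-in so the snippet is self-contained, and in practice you would pass your real model (or use `from_saved_model` with your SavedModel directory):

```python
import tensorflow as tf

# Stand-in model so the example runs; replace with your real model.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(4,)),
    tf.keras.layers.Dense(1),
])

converter = tf.lite.TFLiteConverter.from_keras_model(model)
# Allow ops without TFLite builtins to fall back to the Flex delegate.
converter.target_spec.supported_ops = [
    tf.lite.OpsSet.TFLITE_BUILTINS,
    tf.lite.OpsSet.SELECT_TF_OPS,
]
tflite_model = converter.convert()  # serialized FlatBuffer bytes

with open("model.tflite", "wb") as f:
    f.write(tflite_model)
```

A mismatch between the training TF version and the converter (e.g. converting a TF2 model through the v1 frozen-graph API) is a common source of models that load in Python but fail on device.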
Thanks! I will take a look at that. I am using the prebuilt universal-sentence-encoder-multilingual model and I can't find what version they used to build it other than it was tensorflow 2.0. |
Hi,
I'm getting the following error when initializing the interpreter:
I'm initializing on CPU: