Describe the bug
When converting a TensorFlow Lite (TFLite) model to ONNX with the script below and comparing the outputs of the TFLite model and the converted ONNX model on the same input, discrepancies are observed between them.
Urgency
System information
OS Platform and Distribution: Linux Ubuntu 22.04
TensorFlow Version: 2.16.1
Python version: 3.10.12
ONNX version: 1.16.0
ONNXRuntime version: 1.16.3
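Both runtimes in the repro script are fed NHWC input. As a side note, the einops rearrange used there ('b c h w -> b h w c') is equivalent to a plain NumPy transpose; a minimal, self-contained sketch with the same shapes as the script's random input:

```python
import numpy as np

# NCHW input (batch, channels, height, width), as in the repro script
x_nchw = np.random.randn(1, 3, 224, 224).astype('f')

# TFLite expects NHWC; move the channel axis to the end
x_nhwc = np.transpose(x_nchw, (0, 2, 3, 1))
print(x_nhwc.shape)  # (1, 224, 224, 3)
```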
import tensorflow
import onnxruntime
import numpy as np
import tf2onnx
from einops import rearrange

if __name__ == "__main__":
    tflite_model_path = 'poc_float32.tflite'
    output_onnx_path = './converted.onnx'

    # Convert TensorFlow Lite into ONNX
    tf2onnx.convert.from_tflite(
        tflite_model_path,
        opset=18,
        output_path=output_onnx_path
    )

    # Prepare input for TensorFlow models (NCHW -> NHWC)
    input_np = np.random.randn(1, 3, 224, 224).astype('f')
    input_for_tf = rearrange(input_np, 'b c h w -> b h w c')

    # Load and run ONNX model
    ort_session = onnxruntime.InferenceSession(output_onnx_path)
    ort_output = ort_session.run(None, {'x': input_for_tf})

    # Load TensorFlow Lite model
    interpreter = tensorflow.lite.Interpreter(tflite_model_path)
    input_details = interpreter.get_input_details()
    output_details = interpreter.get_output_details()
    interpreter.allocate_tensors()

    # Run TensorFlow Lite model
    interpreter.set_tensor(input_details[0]['index'], input_for_tf)
    interpreter.invoke()
    tflite_output = interpreter.get_tensor(output_details[0]['index'])

    # Compare ONNX and TensorFlow Lite outputs
    if np.allclose(ort_output[0], tflite_output[0], rtol=1e-03, atol=1e-05):
        print("Test Passed: ONNX and TensorFlow Lite outputs match\n")
    else:
        print("Test Failed: ONNX and TensorFlow Lite outputs differ\n")
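When the np.allclose check in the script fails, it can help to quantify how far apart the outputs actually are. A minimal sketch of such a summary (report_discrepancy is a hypothetical helper, not part of tf2onnx or ONNX Runtime; it mirrors np.allclose's element-wise criterion |a - b| <= atol + rtol * |b|):

```python
import numpy as np

def report_discrepancy(a, b, rtol=1e-03, atol=1e-05):
    """Summarize element-wise differences between two output tensors.

    Hypothetical helper: compares arrays the same way np.allclose does and
    reports the magnitude and extent of the mismatch.
    """
    a = np.asarray(a, dtype=np.float64)
    b = np.asarray(b, dtype=np.float64)
    abs_diff = np.abs(a - b)
    # Elements violating the np.allclose criterion
    mismatched = abs_diff > (atol + rtol * np.abs(b))
    return {
        'max_abs_diff': float(abs_diff.max()),
        'mean_abs_diff': float(abs_diff.mean()),
        'mismatched_fraction': float(mismatched.mean()),
    }

# Synthetic stand-ins for ort_output[0] and tflite_output[0]
x = np.zeros((2, 3))
y = x + 1e-2
print(report_discrepancy(x, y))
```

In the repro script, report_discrepancy(ort_output[0], tflite_output[0]) would show whether the disagreement is a broad numerical drift or a few large outliers.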
Screenshots
poc_float32.tflite and converted ONNX model
To Reproduce
poc link: https://compsec.snu.ac.kr/git/SuhwanSong/poc/-/raw/main/tf2onnx/poc_float32.tflite