Describe the bug
I have a BiSeNet-V2 segmentation model that I need to convert to FP16. I first converted it to an FP32 ONNX model successfully, and prediction works correctly. I then used the `float16_converter` from onnxmltools to produce an FP16 model, but running prediction fails with this error:

onnxruntime.capi.onnxruntime_pybind11_state.Fail: [ONNXRuntimeError] : 1 : FAIL : Load model from model_fp16.onnx failed:Node (Resize__846) Op (Resize) [ShapeInferenceError] Either sizes or scales must be provided, but not both of them

Is the problem related to the `tf.image.resize_bilinear` function? How can I solve it?
I tried to find issues reporting the same problem:
FP16 conversion yields an unusable model
support sizes for Resize op
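One workaround sometimes reported for this error is to post-process the converted model: for any Resize node that ends up with both a `scales` and a `sizes` input, blank the `scales` input (ONNX marks an omitted optional input with an empty string). A sketch, written as a standalone function over the graph's node list; whether it applies to this particular model is an assumption.

```python
def drop_resize_scales(nodes):
    """For Resize nodes carrying both 'scales' (input index 2) and
    'sizes' (input index 3), blank the scales input so shape inference
    sees only one of the two, as the spec requires."""
    for node in nodes:
        if node.op_type == "Resize" and len(node.input) == 4 and node.input[3]:
            node.input[2] = ""  # "" denotes an omitted optional input in ONNX
    return nodes
```

Usage would be along the lines of loading the FP16 model with `onnx.load("model_fp16.onnx")`, calling `drop_resize_scales(model.graph.node)`, and re-saving with `onnx.save`.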
Urgency
Urgent
System information
To Reproduce
Screenshots
Additional context