import_onnx.py parser for onnx opset >= 9 has bug #16590
@lanking520 assign @oorqueda
Please assign it to me until I find an owner
UPDATE: I think MXNet should look at the initializers first. I checked the docs for exporting an ONNX model from PyTorch (torch.onnx.export) and found that I should set keep_initializers_as_inputs=True.
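A minimal sketch of the "look at the initializers first" idea, assuming a simplified mock of an ONNX graph (the real import_onnx.py walks protobuf objects; the class and field names here are hypothetical stand-ins):

```python
# Sketch: resolve a node's input name either from the graph's
# initializers (weights stored in the model, the opset >= 9 layout)
# or from the graph's declared inputs (the pre-opset-9 layout where
# weights were duplicated into graph.input).
# MockGraph is a hypothetical stand-in for the ONNX protobuf graph.

class MockGraph:
    def __init__(self, inputs, initializers):
        self.input = inputs              # list of input names
        self.initializer = initializers  # dict: name -> weight values

def resolve_input(graph, name):
    """Check initializers first, then fall back to graph inputs."""
    if name in graph.initializer:
        return ("param", graph.initializer[name])
    if name in graph.input:
        return ("input", None)
    # A parser that only scans graph.input hits this for opset >= 9
    raise KeyError("node name not found: " + name)

# opset >= 9 style export: weights appear only as initializers
g = MockGraph(inputs=["data"], initializers={"conv0_weight": [0.1, 0.2]})
print(resolve_input(g, "conv0_weight")[0])  # param
print(resolve_input(g, "data")[0])          # input
```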
@oorqueda Would you like to fix it? In my case, I used PyTorch to export ONNX with an opset lower than 8, and I still cannot import the model into MXNet.
Here is the output
Here is the error when I import the model.
I even tried adding the model parameters to the input names for the PyTorch export.
I have the same problem with keras2onnx.
When exporting from PyTorch to ONNX, adding the keep_initializers_as_inputs=True parameter to the export() call solves the issue.
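A quick way to see what that flag changes: with keep_initializers_as_inputs=False (the newer default), initializer names no longer appear among the graph inputs, which is exactly what trips up a pre-opset-9 parser. The sketch below checks for that mismatch on plain name lists; with a real model you would load it via onnx.load(path) and pull the names from model.graph (the helper name here is hypothetical):

```python
# Sketch: find initializer names that a pre-opset-9 parser would
# expect to see among the graph inputs but that are missing.
# missing_initializer_inputs is an illustrative helper, not an
# onnx or mxnet API.

def missing_initializer_inputs(input_names, initializer_names):
    """Initializers absent from the graph's declared inputs."""
    return sorted(set(initializer_names) - set(input_names))

# keep_initializers_as_inputs=False: weights missing from inputs
print(missing_initializer_inputs(
    ["data"], ["fc1_weight", "fc1_bias"]))
# -> ['fc1_bias', 'fc1_weight']

# keep_initializers_as_inputs=True: every weight is also an input
print(missing_initializer_inputs(
    ["data", "fc1_weight", "fc1_bias"], ["fc1_weight", "fc1_bias"]))
# -> []
```

An empty result means the exported model keeps the old layout and should import cleanly.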
Is there a solution for this in TensorFlow?
which causes a "node name not found" error.
The model runs fine with onnxruntime.
Moreover, opset <= 8 works fine as well.