Problems encountered while compiling the onnx model #126

Closed
Zeus1116 opened this issue Dec 21, 2023 · 4 comments · Fixed by #127
Labels: help wanted (Extra attention is needed)

Comments

Zeus1116 commented Dec 21, 2023

I created an nnsmith environment using conda and installed the required libraries following the installation tutorial in cli.md (`python3 -m pip install "nnsmith[torch,onnx,tvm,onnxruntime]" --upgrade`). I can generate onnx models normally, but I encountered an issue when trying to debug a model locally.
When I used the command `nnsmith.model_exec model.type=onnx backend.type=onnxruntime model.path=nnsmith_output/model.onnx`, an AttributeError occurred:
module 'onnx' has no attribute '_serialize'
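For reference, the two commands, copy-paste ready:

```sh
# Install nnsmith with the extras used above (per cli.md)
python3 -m pip install "nnsmith[torch,onnx,tvm,onnxruntime]" --upgrade

# Compile and run the generated model with the onnxruntime backend
nnsmith.model_exec model.type=onnx backend.type=onnxruntime \
    model.path=nnsmith_output/model.onnx
```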

This seemed to be an issue with the onnx version, so I checked and found that the installed onnx version was 1.15.0. After I reinstalled onnx 1.14.0 and ran the same command again, the onnx model compiled and ran smoothly.
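For anyone hitting the same error, the downgrade step is just a version pin:

```sh
# onnx 1.15.0 dropped the private helper nnsmith relies on; pin 1.14.0
python3 -m pip install "onnx==1.14.0"
```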

Zeus1116 (Author) commented Dec 21, 2023

[screenshot: error output from the differential-testing run]

I entered the tvm environment to reinstall the required configuration for nnsmith and ran the command `nnsmith.model_exec model.type=onnx backend.type=onnxruntime model.path=nnsmith_output/model.onnx cmp.with='{type: tvm, optmax: true, target: cpu}'` to conduct differential testing, but encountered the error shown in the figure above. Can someone help me identify the cause?

ganler (Member) commented Dec 21, 2023

Thanks for reporting the issue. Looks like you are right: onnx updated their API in 1.15.0 to load_from_string. Please bear with it a bit by downgrading the onnx version for now until I bring up a fix.
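A minimal compatibility sketch of what such a fix could look like (my illustration only, not necessarily what #127 does): replace the removed private helper with public APIs that exist in both onnx 1.14 and 1.15.

```python
import onnx

def serialize_model(model: onnx.ModelProto) -> bytes:
    # Public protobuf API; the private onnx._serialize helper is gone in 1.15.0.
    return model.SerializeToString()

def deserialize_model(buf: bytes) -> onnx.ModelProto:
    # Public loader (the load_from_string API mentioned above),
    # available in both onnx 1.14 and 1.15.
    return onnx.load_from_string(buf)
```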

ganler (Member) commented Dec 21, 2023

> I entered the tvm environment to reinstall the required configuration for nnsmith, ran the differential-testing command above, and encountered the error shown in the screenshot. Can someone help me identify the cause?

Well, this is a TVM problem and nnsmith cannot help here.

More specifically, the pre-built TVM binary was compiled against a glibc version newer than the one your OS currently has. You can either recompile TVM locally against your local glibc, or install a newer glibc in conda.
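To confirm such a mismatch, a quick diagnostic sketch (the libtvm.so path is a placeholder; locate the real one inside your tvm site-packages):

```sh
# glibc version your OS provides
ldd --version

# glibc symbol versions the pre-built TVM binary requires
strings /path/to/site-packages/tvm/libtvm.so | grep -o 'GLIBC_[0-9.]*' | sort -u
```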

ganler added the help wanted label Dec 21, 2023
Zeus1116 (Author) commented:

> Well, this is a TVM problem and nnsmith cannot help here. [...] You can either recompile TVM locally against your local glibc, or install a newer glibc in conda.

I understand. Thanks for your help.
