

Temporarily pin onnxruntime to <1.16 #4226

Merged
merged 1 commit into master from mgp/pin-onnxruntime on Sep 22, 2023

Conversation

mguaypaq
Member

Due to microsoft/onnxruntime#17631, version 1.16.0 is causing CI to fail on master. We can change the pin to !=1.16.0 (or to however many versions are buggy) once upstream releases a fix.

Part of #4225.
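To illustrate the difference between the two pin strategies mentioned above, here is a minimal sketch. The helper names are hypothetical, and the tuple comparison is a simplification of real PEP 440 version matching, but it captures the distinction between `<1.16` and `!=1.16.0`:

```python
# Hypothetical helpers illustrating the two pin strategies discussed above.
# Simplified tuple comparison, not full PEP 440 semantics.

def parse(v):
    """Parse a dotted version string like '1.16.0' into a comparable tuple."""
    return tuple(int(part) for part in v.split("."))

def satisfies_lt_1_16(v):
    # Matches the temporary pin: onnxruntime<1.16 (blocks all of 1.16.x)
    return parse(v) < (1, 16)

def satisfies_ne_1_16_0(v):
    # Matches the narrower pin: onnxruntime!=1.16.0 (blocks only 1.16.0)
    return parse(v) != (1, 16, 0)

print(satisfies_lt_1_16("1.15.1"))    # True
print(satisfies_lt_1_16("1.16.0"))    # False
print(satisfies_lt_1_16("1.16.1"))    # False
print(satisfies_ne_1_16_0("1.16.1"))  # True
```

This is why the temporary `<1.16` pin is the safer stopgap: it also excludes any buggy 1.16.x follow-ups, and can be relaxed to `!=1.16.0` once upstream ships a fix.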

@mguaypaq mguaypaq added bug category: fixes an error in the code upstream Issue caused by software dependencies labels Sep 22, 2023
@mguaypaq mguaypaq added this to the 6.1 milestone Sep 22, 2023
@joshuacwnewton
Member

Reading through the underlying onnxruntime issue, it seems like this could also be fixed simply by passing the CPU provider in the providers argument:

The workaround is to explicitly supply the CPUExecutionProvider in the list.

Since we currently use onnxruntime (the CPU version), as opposed to onnxruntime-gpu, I think it should be fine to specify this explicitly?

In deepseg_/onnx.py:

-    ort_sess = ort.InferenceSession(model_path, sess_options=sess_options)
+    ort_sess = ort.InferenceSession(model_path, sess_options=sess_options, providers=['CPUExecutionProvider'])
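For context, the proposed change could be exercised along these lines. This is a minimal sketch, not the actual SCT code: `make_session` is a hypothetical wrapper, and the import is guarded so the snippet stands alone even without onnxruntime installed.

```python
# Sketch of the proposed workaround (hypothetical wrapper around the diff above).
# Assumes the CPU-only onnxruntime package; the import is guarded so the
# snippet degrades gracefully when onnxruntime is not installed.
try:
    import onnxruntime as ort
    HAVE_ORT = True
except ImportError:
    HAVE_ORT = False

def make_session(model_path):
    """Create an InferenceSession pinned to the CPU execution provider."""
    if not HAVE_ORT:
        raise RuntimeError("onnxruntime is not installed")
    sess_options = ort.SessionOptions()
    # Explicitly naming CPUExecutionProvider sidesteps the 1.16.0 error
    # that demands an explicit providers list.
    return ort.InferenceSession(
        model_path,
        sess_options=sess_options,
        providers=["CPUExecutionProvider"],
    )
```

Note that this hard-codes CPU inference, which matches the current dependency (onnxruntime, not onnxruntime-gpu).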

@mguaypaq
Member Author

This is what's suggested by the error message, but I kind of agree with the explanation under "Describe the issue" that the suggestion may be misleading. The docs show the Azure provider as preview-only (link) and a "Default CPU" provider, and onnxruntime.InferenceSession lists providers as an optional argument (link), so I don't think it's meant to be mandatory.

@mguaypaq
Member Author

But ultimately, I'm fine with either fix, up to you.

@joshuacwnewton
Member

My thinking here was that we might want to keep the versions as open as possible. That said, I have no guarantee that the providers fix is backwards-compatible with older versions. So, on second thought, I like your fix a little more -- we can always go back and specify onnxruntime!=1.16.0 once the patch comes out.

@mguaypaq
Member Author

Looking at the changelog for ORT 1.9, which is mentioned in the error message: judging from the first point under Packages, it looks like this requirement was meant for the GPU packages only, not for the basic CPU package.

@mguaypaq mguaypaq merged commit 792a155 into master Sep 22, 2023
24 checks passed
@mguaypaq mguaypaq deleted the mgp/pin-onnxruntime branch September 22, 2023 15:04
Successfully merging this pull request may close these issues.

Upstream bug: onnxruntime==1.16.0 requires a parameter meant for GPUs (even though SCT uses the CPU version)