
Upstream bug: onnxruntime==1.16.0 requires a parameter meant for GPUs (even though SCT uses the CPU version) #4225

Closed
mguaypaq opened this issue Sep 21, 2023 · 2 comments · Fixed by #4226 or #4256
@mguaypaq
Member

See for example this output, taken from this testing run:

FAILED testing/api/test_deepseg_gm.py::TestCore::test_segment_volume - ValueError: This ORT build has ['AzureExecutionProvider', 'CPUExecutionProvider'] enabled. Since ORT 1.9, you are required to explicitly set the providers parameter when instantiating InferenceSession. For example, onnxruntime.InferenceSession(..., providers=['AzureExecutionProvider', 'CPUExecutionProvider'], ...)
FAILED testing/api/test_deepseg_lesion.py::test_segment - ValueError: This ORT build has ['AzureExecutionProvider', 'CPUExecutionProvider'] enabled. Since ORT 1.9, you are required to explicitly set the providers parameter when instantiating InferenceSession. For example, onnxruntime.InferenceSession(..., providers=['AzureExecutionProvider', 'CPUExecutionProvider'], ...)
FAILED testing/api/test_deepseg_sc.py::test_deep_segmentation_spinalcord[params0] - ValueError: This ORT build has ['AzureExecutionProvider', 'CPUExecutionProvider'] enabled. Since ORT 1.9, you are required to explicitly set the providers parameter when instantiating InferenceSession. For example, onnxruntime.InferenceSession(..., providers=['AzureExecutionProvider', 'CPUExecutionProvider'], ...)
FAILED testing/api/test_deepseg_sc.py::test_deep_segmentation_spinalcord[params1] - ValueError: This ORT build has ['AzureExecutionProvider', 'CPUExecutionProvider'] enabled. Since ORT 1.9, you are required to explicitly set the providers parameter when instantiating InferenceSession. For example, onnxruntime.InferenceSession(..., providers=['AzureExecutionProvider', 'CPUExecutionProvider'], ...)
FAILED testing/cli/test_cli_sct_deepseg_gm.py::test_sct_deepseg_gm_check_dice_coefficient_against_groundtruth - ValueError: This ORT build has ['AzureExecutionProvider', 'CPUExecutionProvider'] enabled. Since ORT 1.9, you are required to explicitly set the providers parameter when instantiating InferenceSession. For example, onnxruntime.InferenceSession(..., providers=['AzureExecutionProvider', 'CPUExecutionProvider'], ...)
FAILED testing/cli/test_cli_sct_deepseg_lesion.py::test_sct_deepseg_lesion_no_checks - ValueError: This ORT build has ['AzureExecutionProvider', 'CPUExecutionProvider'] enabled. Since ORT 1.9, you are required to explicitly set the providers parameter when instantiating InferenceSession. For example, onnxruntime.InferenceSession(..., providers=['AzureExecutionProvider', 'CPUExecutionProvider'], ...)
FAILED testing/cli/test_cli_sct_deepseg_sc.py::test_sct_deepseg_sc_check_output_qform_sform - ValueError: This ORT build has ['AzureExecutionProvider', 'CPUExecutionProvider'] enabled. Since ORT 1.9, you are required to explicitly set the providers parameter when instantiating InferenceSession. For example, onnxruntime.InferenceSession(..., providers=['AzureExecutionProvider', 'CPUExecutionProvider'], ...)
FAILED testing/cli/test_cli_sct_deepseg_sc.py::test_sct_deepseg_sc_qc_report_exists - ValueError: This ORT build has ['AzureExecutionProvider', 'CPUExecutionProvider'] enabled. Since ORT 1.9, you are required to explicitly set the providers parameter when instantiating InferenceSession. For example, onnxruntime.InferenceSession(..., providers=['AzureExecutionProvider', 'CPUExecutionProvider'], ...)

This appeared in the past few days, probably due to a new release of onnxruntime? I'll investigate tomorrow, unless someone beats me to it.
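For reference, the error message itself spells out the workaround: pass the `providers` argument explicitly when constructing the `InferenceSession`. A minimal sketch of that idea (the helper name and the CPU-only preference are assumptions for illustration, since SCT uses the CPU build of onnxruntime):

```python
# Sketch of the workaround suggested by the error message: build the
# providers list explicitly instead of relying on the pre-1.16 default.
# (Helper name and CPU-only preference are assumptions, not SCT's code.)

def cpu_providers(available):
    """Keep only CPUExecutionProvider if present, else fall back to all."""
    cpu = [p for p in available if p == "CPUExecutionProvider"]
    return cpu or list(available)

# With onnxruntime installed, the session would then be created as:
#   import onnxruntime
#   session = onnxruntime.InferenceSession(
#       model_path,
#       providers=cpu_providers(onnxruntime.get_available_providers()),
#   )

print(cpu_providers(["AzureExecutionProvider", "CPUExecutionProvider"]))
```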

@mguaypaq
Member Author

It looks like this is tracked upstream: microsoft/onnxruntime#17631. According to the changelog (see under Known Issues), this is considered a bug, and a fix is in progress. Until it's released, I'll pin onnxruntime to <1.16.
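The pin itself amounts to a one-line constraint in the project's dependency specification (the exact file and bound shown here are assumptions; any upper bound below 1.16 would have the same effect):

```
onnxruntime<1.16.0
```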

mguaypaq added a commit that referenced this issue Sep 22, 2023
mguaypaq added a commit that referenced this issue Sep 22, 2023
@mguaypaq mguaypaq added this to the 6.2 milestone Sep 22, 2023
@mguaypaq
Member Author

Let's revisit the version pin when we prepare SCT v6.2.

@joshuacwnewton joshuacwnewton changed the title CI failing on master Upstream bug: onnxruntime==1,16,0 requires a parameter meant for GPUs (even though SCT uses the CPU version) Sep 22, 2023
@joshuacwnewton joshuacwnewton changed the title Upstream bug: onnxruntime==1,16,0 requires a parameter meant for GPUs (even though SCT uses the CPU version) Upstream bug: onnxruntime==1.16.0 requires a parameter meant for GPUs (even though SCT uses the CPU version) Sep 22, 2023
@joshuacwnewton joshuacwnewton linked a pull request Sep 26, 2023 that will close this issue