Description
Describe the documentation issue
I'm using the latest prebuilt ONNX Runtime package (1.20.0) for Android from here: https://mvnrepository.com/artifact/com.microsoft.onnxruntime/onnxruntime-android/1.20.0
From that package I'm using the "jni/arm64-v8a/libonnxruntime.so" and the C++ header to run inference on an ONNX model.
This works fine using default session options.
But when I try to select the XNNPACK execution provider following the instructions from https://onnxruntime.ai/docs/execution-providers/Xnnpack-ExecutionProvider.html, the program terminates (without any error details) when invoking:
session_options.AppendExecutionProvider("XNNPACK", {{"intra_op_num_threads", std::to_string(intra_op_num_threads)}});
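For reference, here is a minimal, self-contained sketch of the setup that crashes (the environment name, model path, and thread count are placeholders, not my actual values):

```cpp
#include <onnxruntime_cxx_api.h>
#include <string>

int main() {
  Ort::Env env(ORT_LOGGING_LEVEL_WARNING, "xnnpack-test");  // placeholder log id
  Ort::SessionOptions session_options;

  int intra_op_num_threads = 1;  // placeholder value
  // The process terminates inside this call, with no error output:
  session_options.AppendExecutionProvider(
      "XNNPACK",
      {{"intra_op_num_threads", std::to_string(intra_op_num_threads)}});

  // With default session options (i.e. without the call above),
  // creating the session and running inference works fine.
  Ort::Session session(env, "model.onnx", session_options);  // placeholder path
  return 0;
}
```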
Note that Ort::GetAvailableProviders() tells me that the XNNPACK provider is available (see the snippet after this list for how it was queried):
NnapiExecutionProvider
XnnpackExecutionProvider
CPUExecutionProvider
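The list above was printed with a loop like this (the iostream output is just for illustration):

```cpp
#include <onnxruntime_cxx_api.h>
#include <iostream>

int main() {
  // Ort::GetAvailableProviders() returns the providers this build supports.
  for (const std::string& provider : Ort::GetAvailableProviders())
    std::cout << provider << "\n";
  return 0;
}
```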
Setting
session_options.AddConfigEntry(kOrtSessionOptionsConfigAllowIntraOpSpinning, "0");
session_options.SetIntraOpNumThreads(1);
or passing different values for "intra_op_num_threads" makes no difference.
What am I missing?
Also, how do I determine what the default execution provider is? I assume it's "CPUExecutionProvider".
Page / URL
https://onnxruntime.ai/docs/execution-providers/Xnnpack-ExecutionProvider.html