Selecting XNNPACK as execution provider for Android following the documentation example results in program termination #23826
Comments
It seems the XNNPACK backend is sensitive to a specific layer or to the layout of the model. This may indicate that the failure happens in the "GraphTransformer Level2_RuleBasedTransformer" pass. Please find the logcat dump for the failing ONNX model inference below:

02-26 15:12:55.442 21887 21887 I servicemanager: Caller(pid=19930,uid=1068,sid=u:r:secure_element:s0:c44,c260,c512,c768) Tried to start aidl service android.hardware.secure_element.ISecureElement/eSE1 as a lazy service, but was unable to. Usually this happens when a service is not installed, but if the service is intended to be used as a lazy service, then it may be configured incorrectly.
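One way to see which graph-transformer pass is active when the termination happens is to create the Ort::Env with verbose logging and a custom logging callback that forwards ONNX Runtime's messages to logcat. A minimal sketch, assuming the log tag and the helper names (OrtToLogcat, MakeVerboseEnv) are placeholders:

```cpp
#include <android/log.h>

#include "onnxruntime_cxx_api.h"

// Forwards ONNX Runtime's internal log messages to logcat. The "onnxruntime"
// tag is arbitrary; severity is included as a plain integer.
static void OrtToLogcat(void* /*param*/, OrtLoggingLevel severity,
                        const char* category, const char* logid,
                        const char* code_location, const char* message) {
  __android_log_print(ANDROID_LOG_INFO, "onnxruntime", "[%d] %s %s %s: %s",
                      static_cast<int>(severity), logid, category,
                      code_location, message);
}

Ort::Env MakeVerboseEnv() {
  // VERBOSE makes each optimization pass (e.g. Level2_RuleBasedTransformer)
  // log as it runs, which narrows down where the termination happens.
  return Ort::Env(ORT_LOGGING_LEVEL_VERBOSE, "xnnpack-debug", OrtToLogcat,
                  nullptr);
}
```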
It turns out the issue is that the XNNPACK backend cannot handle this specific ONNX model's parameters for the Resize operation.
Describe the documentation issue
I'm using the latest prebuilt ONNX Runtime package (1.20.0) for Android from here: https://mvnrepository.com/artifact/com.microsoft.onnxruntime/onnxruntime-android/1.20.0
From that package I'm using the "jni/arm64-v8a/libonnxruntime.so" and the C++ header to run inference on an ONNX model.
This works fine using default session options.
But when I try to select the XNNPACK execution provider following the instructions from here: https://onnxruntime.ai/docs/execution-providers/Xnnpack-ExecutionProvider.html, the program terminates (without any error details) when invoking session_options.AppendExecutionProvider("XNNPACK", {{"intra_op_num_threads", std::to_string(intra_op_num_threads)}});
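For reference, a minimal sketch of the failing setup, wrapped in a try/catch so that an Ort::Exception message would be surfaced instead of a silent termination (the model path and thread count are placeholders; this only helps if the failure is raised as a C++ exception rather than a hard abort inside the library):

```cpp
#include <iostream>
#include <string>

#include "onnxruntime_cxx_api.h"

int main() {
  Ort::Env env(ORT_LOGGING_LEVEL_WARNING, "xnnpack-test");
  Ort::SessionOptions session_options;
  try {
    // The call from the XNNPACK EP documentation; the termination
    // reported above happens here, before the session is even created.
    session_options.AppendExecutionProvider(
        "XNNPACK", {{"intra_op_num_threads", std::to_string(4)}});
    // "model.onnx" is a placeholder path.
    Ort::Session session(env, "model.onnx", session_options);
  } catch (const Ort::Exception& e) {
    std::cerr << "ORT error " << e.GetOrtErrorCode() << ": " << e.what()
              << "\n";
    return 1;
  }
  return 0;
}
```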
Note that Ort::GetAvailableProviders() tells me that the XNNPACK provider is available:
NnapiExecutionProvider
XnnpackExecutionProvider
CPUExecutionProvider
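For completeness, a minimal sketch of how that list can be obtained with the documented Ort::GetAvailableProviders() call:

```cpp
#include <iostream>

#include "onnxruntime_cxx_api.h"

int main() {
  // Lists the execution providers compiled into this libonnxruntime.so;
  // with the prebuilt Android package this prints the three entries above.
  for (const auto& provider : Ort::GetAvailableProviders()) {
    std::cout << provider << "\n";
  }
  return 0;
}
```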
Setting
session_options.AddConfigEntry(kOrtSessionOptionsConfigAllowIntraOpSpinning, "0");
session_options.SetIntraOpNumThreads(1);
or using different "intra_op_num_threads" values does not make a difference.
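Taken together, a sketch of the combination tried above, wrapped in a hypothetical helper (MakeXnnpackSessionOptions is my name; the thread count is caller-supplied):

```cpp
#include <string>

#include "onnxruntime_cxx_api.h"
// Defines kOrtSessionOptionsConfigAllowIntraOpSpinning.
#include "onnxruntime_session_options_config_keys.h"

Ort::SessionOptions MakeXnnpackSessionOptions(int intra_op_num_threads) {
  Ort::SessionOptions session_options;
  // Hand the worker threads to XNNPACK's own pool...
  session_options.AppendExecutionProvider(
      "XNNPACK",
      {{"intra_op_num_threads", std::to_string(intra_op_num_threads)}});
  // ...and, per the XNNPACK EP docs, keep ORT's intra-op pool at a single
  // non-spinning thread so the two pools don't contend.
  session_options.AddConfigEntry(kOrtSessionOptionsConfigAllowIntraOpSpinning,
                                 "0");
  session_options.SetIntraOpNumThreads(1);
  return session_options;
}
```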
What am I missing?
How do I determine what the default execution provider is? I assume it's "CPUExecutionProvider".
Page / URL
https://onnxruntime.ai/docs/execution-providers/Xnnpack-ExecutionProvider.html