Describe the issue
The 1.22.0 onnxruntime release notes say this version supports the QNN GPU backend on Windows ARM64.
I downloaded the 1.22.0 pre-built package from https://www.nuget.org/packages/Microsoft.ML.OnnxRuntime.QNN#readme-body-tab
and tried to run a small model with the QNN GPU backend.
However, I get the following error when creating the network from a model buffer with Ort::Session(...):
--- image_classification, device:GPU
ONNXRuntime Version: 1.22.0
ONNXRuntime supported device on this machine
AUTO CPU GPU
Current device is GPU
backend:C:\Users\ming\work\qnn-onnxruntime-test\build\Release\QnnGpu.dll
2025-06-20 22:06:20.2858459 [W:onnxruntime:, qnn_backend_manager.cc:976 onnxruntime::qnn::QnnBackendManager::SetupBackend] Failed to setup so cleaning up
2025-06-20 22:06:20.2924383 [E:onnxruntime:, qnn_execution_provider.cc:767 onnxruntime::QNNExecutionProvider::GetCapability] QNN SetupBackend failed qnn_backend_manager.cc:426 onnxruntime::qnn::QnnBackendManager::InitializeBackend Failed to initialize backend. Error: QNN_COMMON_ERROR_PLATFORM_NOT_SUPPORTED: Attempt to use QNN API on an unsupported platform
2025-06-20 22:06:20.3102722 [E:onnxruntime:, qnn_backend_manager.cc:1244 onnxruntime::qnn::QnnBackendManager::ReleaseResources] Failed to unload backend library: Failed to free library.
My environment is listed in the fields below.

To reproduce
Download the package from https://www.nuget.org/packages/Microsoft.ML.OnnxRuntime.QNN#readme-body-tab
Then create an onnxruntime session in the usual way with the backend set to "QnnGpu.dll"; you will see the error pasted above.
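For reference, a minimal sketch of the setup that triggers the error, assuming the standard QNN execution provider options API (the model path and environment name are placeholders, not from my actual project):

```cpp
// Minimal repro sketch (placeholder model path; requires the
// Microsoft.ML.OnnxRuntime.QNN NuGet package on Windows ARM64).
#include <onnxruntime_cxx_api.h>
#include <string>
#include <unordered_map>

int main() {
    Ort::Env env(ORT_LOGGING_LEVEL_WARNING, "qnn-gpu-test");
    Ort::SessionOptions session_options;

    // Select the QNN execution provider and point it at the GPU backend DLL.
    std::unordered_map<std::string, std::string> qnn_options{
        {"backend_path", "QnnGpu.dll"}};
    session_options.AppendExecutionProvider("QNN", qnn_options);

    // Session creation is where QNN SetupBackend fails with
    // QNN_COMMON_ERROR_PLATFORM_NOT_SUPPORTED.
    Ort::Session session(env, L"model.onnx", session_options);
    return 0;
}
```

The same code with backend_path set to "QnnHtp.dll" or "QnnCpu.dll" does not hit this error path; the failure is specific to QnnGpu.dll.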
Urgency
No response
Platform
Windows
OS Version
Windows 11 Enterprise Insider Preview (Version: Dev)
ONNX Runtime Installation
Released Package
ONNX Runtime Version or Commit ID
1.22.0
ONNX Runtime API
C++
Architecture
ARM64
Execution Provider
Other / Unknown
Execution Provider Library Version
QNN library version is 2.33.2.0