Crash with certain models in C++ DirectML, but not in Python DirectML #16564
Labels: ep:DML (issues related to the DirectML execution provider), platform:windows (issues related to the Windows platform)
Describe the issue
Certain ONNX models work (load and run inference) when using the DirectML provider in C++, but others don't: they crash either on load or during inference. However, loading and running inference on the same models with the DirectML provider in Python works fine.
To reproduce
model1.onnx: Crash on Load
You can download model1.onnx from here.
In Python, this loads and infers fine with DirectML:
However, in C++ it crashes during loading:
The crash produces the following message:
model2.onnx: Crash on Inference
You can download model2.onnx from here.
In Python, this loads and infers fine with DirectML:
However, in C++ it crashes during inference:
The crash produces the following message:
C++ Library Used
https://github.com/microsoft/onnxruntime/releases/download/v1.15.1/Microsoft.ML.OnnxRuntime.DirectML.1.15.1.zip
Urgency
Serious: without this I can't offer hardware acceleration on Windows in my C++ software, and CPU inference is much slower.
Platform
Windows
OS Version
11
ONNX Runtime Installation
Released Package
ONNX Runtime Version or Commit ID
1.15.1
ONNX Runtime API
C++
Architecture
X64
Execution Provider
DirectML
Execution Provider Library Version
No response