This issue was moved to a discussion.
how to build onnxruntime with latest version of onnx #20615
Labels
platform:windows (issues related to the Windows platform)
Describe the issue
We need to run a bf16 model in ONNX Runtime, but ONNX only added bf16 support for many ops in opset 22 (onnx/onnx@4e7289d).
What is the best way to build ORT against the latest ONNX version? We tried following https://github.com/microsoft/onnxruntime/blob/main/docs/How_To_Update_ONNX_Dev_Notes.md.
However, we are still hitting this error:
INVALID_GRAPH : Load model from down.onnx failed:This is an invalid model. In Node, ("/block/resnets.0/norm1/Reshape", Reshape, "", -1) : ("sample_in": tensor(bfloat16),"/block/resnets.0/norm1/Constant_output_0": tensor(int64),) -> ("/block/resnets.0/norm1/Reshape_output_0": tensor(bfloat16),) , Error No Op registered for Reshape with domain_version of 22
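For reference, the rough steps we followed look like the sketch below. This is an assumption-laden outline, not a verified recipe: the submodule path and the exact build flags come from a typical ORT source build, and the dev notes linked above list additional generated files that must also be updated.

```shell
# Point the ONNX submodule at the commit that adds opset 22 bf16 support.
# (Paths and flags are assumptions; see How_To_Update_ONNX_Dev_Notes.md for
# the full set of files that need regenerating after bumping the submodule.)
cd onnxruntime/cmake/external/onnx
git fetch origin
git checkout 4e7289d          # the onnx commit referenced above
cd ../../..
git add cmake/external/onnx

# Rebuild from source on Windows (Default CPU EP):
.\build.bat --config RelWithDebInfo --build_wheel --parallel --skip_tests
```

Note that the error above ("No Op registered for Reshape with domain_version of 22") suggests updating the ONNX submodule alone may not be enough: ONNX Runtime registers its kernels per opset version, so ops can still fail to resolve until kernel registrations for opset 22 exist in the ORT source as well.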
To reproduce
NA
Urgency
No response
Platform
Windows
OS Version
11
ONNX Runtime Installation
Built from Source
ONNX Runtime Version or Commit ID
a0db218
ONNX Runtime API
Python
Architecture
X64
Execution Provider
Default CPU
Execution Provider Library Version
No response