microsoft / onnxruntime

ONNX Runtime: cross-platform, high performance ML inferencing and training accelerator

Home Page: https://onnxruntime.ai

How to build onnxruntime with the latest version of onnx

ZchiPitt opened this issue · comments

Describe the issue

We need to run a bf16 model in ORT, but onnx only recently added bf16 support for many ops in opset 22 (onnx/onnx@4e7289d).

What is the best way to build ORT against the latest onnx version? We tried following https://github.com/microsoft/onnxruntime/blob/main/docs/How_To_Update_ONNX_Dev_Notes.md.

However, we are still hitting this error:
INVALID_GRAPH : Load model from down.onnx failed:This is an invalid model. In Node, ("/block/resnets.0/norm1/Reshape", Reshape, "", -1) : ("sample_in": tensor(bfloat16),"/block/resnets.0/norm1/Constant_output_0": tensor(int64),) -> ("/block/resnets.0/norm1/Reshape_output_0": tensor(bfloat16),) , Error No Op registered for Reshape with domain_version of 22

To reproduce

NA

Urgency

No response

Platform

Windows

OS Version

11

ONNX Runtime Installation

Built from Source

ONNX Runtime Version or Commit ID

a0db218

ONNX Runtime API

Python

Architecture

X64

Execution Provider

Default CPU

Execution Provider Library Version

No response

See the change added in #19745 for Reshape. You need to do something similar for the new opset 22.