onnx / onnxmltools

ONNXMLTools enables conversion of models to ONNX

Home Page: https://onnx.ai

ONNXRuntime failure on models post FP16 conversion through onnxmltools

ramkrishna2910 opened this issue · comments

A PyTorch model converted to ONNX with the script below runs successfully through ONNX Runtime.
But after FP16 conversion, the model fails with:

[ONNXRuntimeError] : 1 : FAIL : Type Error: Type parameter (T) of Optype (Add) bound to different types (tensor(float16) and tensor(float)) in node ()

The same failure is observed in other models as well.
The FP16 version of the model is attached:
cg_graph_convolutions_43350e0e-op14-opt-f16.onnx.zip

Config:
Ubuntu 18.04
ONNX Runtime 1.14.1
onnxmltools 1.11.2

import torch
from torch_geometric.datasets import Planetoid
from torch_geometric.nn import CGConv

# Load the Cora citation dataset and take its single graph.
dataset = Planetoid(root=".", name="Cora")
data = dataset[0]
edge_index_rows = 2  # edge_index is a (2, num_edges) tensor

model = CGConv(dataset.num_features)

# Dummy inputs matching the model's forward signature.
inputs = {
    "x": torch.zeros(data.num_nodes, data.num_features, dtype=torch.float),
    "edge_index": torch.zeros(edge_index_rows, data.num_nodes, dtype=torch.int64),
}
model.eval()
torch.onnx.export(
    model,
    inputs,
    "cg.onnx",
    opset_version=14,
    do_constant_folding=True,
    input_names=["input"],
    output_names=["output"],
)

This issue is related to the converter from torch to ONNX. It should be posted on that repository.

Can confirm - I am getting the same issue with the latest version.

Has there been any fix to this? @ramkrishna2910