microsoft / onnxruntime

ONNX Runtime: cross-platform, high performance ML inferencing and training accelerator

Home Page: https://onnxruntime.ai

[Performance] INT32 CLIP Support on QNN

kory opened this issue

Describe the issue

QNN supports INT32 Clip (lowered to ReLUMinMax) via the QNN ONNX converter, but the ONNX Runtime QNN EP does not support Clip with INT32 inputs, so the node falls back to CPU.

To reproduce

See attached model.

Affects OpenAI_CLIP models.

Urgency

No response

Platform

Linux

OS Version

Ubuntu 22.04

ONNX Runtime Installation

Built from Source

ONNX Runtime Version or Commit ID

Latest

ONNX Runtime API

Python

Architecture

ARM64

Execution Provider

QNN EP

Execution Provider Library Version

QNN 2.20

Model File

clip.onnx.zip

Is this a quantized model?

No