onnx/models

A collection of pre-trained, state-of-the-art models in the ONNX format

Home Page: http://onnx.ai/models/

Cannot convert MaskRCNN onnx model to TensorRT

oriolorra opened this issue

Bug Report

Which model does this pertain to?

MaskRCNN-10.onnx

Describe the bug

I am trying to use TensorRT as a backend via onnx_tensorrt, with this small piece of Python code:

import onnx
import onnx_tensorrt.backend as backend

filename = "MaskRCNN-10.onnx"
model = onnx.load(filename)
onnx.checker.check_model(model)  # passes without errors

engine = backend.prepare(model, device='CUDA:0')

Then I get this error:

RuntimeError: While parsing node number 902:
ModelImporter.cpp:168 In function parseGraph:
[6] Invalid Node - 908
This version of TensorRT only supports input K as an initializer. Try applying constant folding on the model using Polygraphy: https://github.com/NVIDIA/TensorRT/tree/master/tools/Polygraphy/examples/cli/surgeon/02_folding_constants

So I tried what it suggests and executed this command:

polygraphy surgeon sanitize MaskRCNN-10.onnx --fold-constants  -o folded.onnx

But after executing this Python code:

import onnx
import onnx_tensorrt.backend as backend

# load the constant-folded model produced by Polygraphy
model = onnx.load("folded.onnx")
onnx.checker.check_model(model)

engine = backend.prepare(model, device='CUDA:0')

it reports essentially the same error:

RuntimeError: While parsing node number 609:
ModelImporter.cpp:168 In function parseGraph:
[6] Invalid Node - 908
This version of TensorRT only supports input K as an initializer. Try applying constant folding on the model using Polygraphy: https://github.com/NVIDIA/TensorRT/tree/master/tools/Polygraphy/examples/cli/surgeon/02_folding_constants

Reproduction instructions

System Information

Ubuntu 22.04
Python 3.10.6
ONNX 1.14.0
onnx_tensorrt 8.5.1
