onnx / models

A collection of pre-trained, state-of-the-art models in the ONNX format

Home Page: http://onnx.ai/models/

Can't convert MaskRCNN from Matterport to ONNX

assia855 opened this issue · comments

Hello,

I'm struggling to convert the Mask R-CNN model from the Matterport repository to ONNX. If you can provide a code snippet or an end-to-end conversion example, that would be a great help.

@assia855, I was using tensorflow.keras.models.save_model to export the model to the SavedModel format, and then tf2onnx.convert to convert it to ONNX. However, this did not work with the original repository https://github.com/matterport/Mask_RCNN.
There are newer repositories such as https://github.com/ahmedfgad/Mask-RCNN-TF2
and https://github.com/akTwelve/Mask_RCNN. The conversion worked for me with the latter, but inference runs only with 'CUDAExecutionProvider' (Linux) and 'CPUExecutionProvider'. It fails with 'DmlExecutionProvider' on Windows with "Non-zero status code returned while running ScatterElements node. Exception(6) tid(3c34) 80070057 The parameter is incorrect."

@SergeySandler Thank you so much for your reply and this information. So I will test those repositories.

@assia855, just to add to my earlier comments: we did not retrain our custom model with akTwelve/Mask_RCNN, and a custom model trained with the original matterport/Mask_RCNN repository did not predict correctly under akTwelve/Mask_RCNN.
With the attached mrcnn.zip, the same custom model runs on the latest TensorFlow stack (e.g. tensorflow/tensorflow:2.10.0-gpu-jupyter) and predicts as expected. However, as mentioned above, while it converts to ONNX format and onnx.checker.check_model() does not report any validation errors, it fails with 'DmlExecutionProvider' on Windows.
It would be interesting to know whether you see the same behavior with your model in ONNX Runtime.