daquexian / onnx-simplifier

Simplify your onnx model

[BUG] Node (Reshape_2173) Op (Reshape) [ShapeInferenceError] Dimension could not be inferred: incompatible shapes

harrymore opened this issue

Describe the bug
I use the mmdeploy framework to convert a Cascade Mask R-CNN (FPN) model to an ONNX model. After simplifying it with onnxsim, I run inference with onnxruntime and the following error occurs (the original ONNX model runs inference without problems):
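For reference, the workflow is roughly the following (a minimal sketch; the file names are placeholders, not the exact mmdeploy output):

```python
import onnx
import onnxruntime
from onnxsim import simplify

# Load the ONNX model exported by mmdeploy (path is a placeholder)
model = onnx.load("end2end.onnx")

# Simplify the graph with onnxsim and save the result
model_sim, check = simplify(model)
assert check, "simplified model could not be validated"
onnx.save(model_sim, "end2end_sim.onnx")

# Creating the session on the simplified model raises the Reshape error below
ort_session = onnxruntime.InferenceSession(
    "end2end_sim.onnx",
    providers=["CUDAExecutionProvider", "CPUExecutionProvider"],
)
```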

Fail Traceback (most recent call last)
Cell In[9], line 3
1 import onnxruntime
----> 3 ort_session = onnxruntime.InferenceSession(onnx_sim_file, providers=['CUDAExecutionProvider', 'CPUExecutionProvider'])
4 input_data, ori_shape = cv_process(test_image, img_size, mean, std, size_divisor)
5 ort_inputs = {'input': input_data}

File /opt/conda/envs/py3.9/lib/python3.9/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py:419, in InferenceSession.__init__(self, path_or_bytes, sess_options, providers, provider_options, **kwargs)
416 disabled_optimizers = kwargs["disabled_optimizers"] if "disabled_optimizers" in kwargs else None
418 try:
--> 419 self._create_inference_session(providers, provider_options, disabled_optimizers)
420 except (ValueError, RuntimeError) as e:
421 if self._enable_fallback:

File /opt/conda/envs/py3.9/lib/python3.9/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py:472, in InferenceSession._create_inference_session(self, providers, provider_options, disabled_optimizers)
469 self._register_ep_custom_ops(session_options, providers, provider_options, available_providers)
471 if self._model_path:
--> 472 sess = C.InferenceSession(session_options, self._model_path, True, self._read_config_from_model)
473 else:
474 sess = C.InferenceSession(session_options, self._model_bytes, False, self._read_config_from_model)

Fail: [ONNXRuntimeError] : 1 : FAIL : Load model from ../../model_zoo_common/disease_fp32_1.0.0.1_sim.onnx failed:Node (Reshape_2173) Op (Reshape) [ShapeInferenceError] Dimension could not be inferred: incompatible shapes

The node Reshape_2173:

[screenshot of the Reshape_2173 node]
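
The failing node can also be inspected without onnxruntime, using the onnx Python API (a minimal sketch; the file name is a placeholder and the node name is taken from the error above):

```python
import onnx
from onnx import numpy_helper, shape_inference

model = onnx.load("end2end_sim.onnx")  # placeholder path

# Print the inputs of the offending Reshape node and, if its target
# shape is stored as an initializer, the constant shape value
for node in model.graph.node:
    if node.name == "Reshape_2173":
        print("inputs:", node.input)
        for init in model.graph.initializer:
            if init.name == node.input[1]:
                print("target shape:", numpy_helper.to_array(init))

# Strict shape inference reproduces the same "incompatible shapes" complaint
shape_inference.infer_shapes(model, strict_mode=True)
```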

We are using mmdeploy and have the same issue. Have you fixed it?