PaddlePaddle / PaddleOCR

Awesome multilingual OCR toolkits based on PaddlePaddle (practical ultra-lightweight OCR system; supports recognition of 80+ languages; provides data annotation and synthesis tools; supports training and deployment on server, mobile, embedded, and IoT devices)


cpp infer: error when enabling OnnxRuntime

ANDROIDTODO opened this issue · comments

Hi,
When running cpp infer, I enabled OnnxRuntime:

paddle_infer::Config config;
config.SetModel(model_dir + "/inference.pdmodel", model_dir + "/inference.pdiparams");
config.EnableONNXRuntime();
config.EnableORTOptimization();

and got this error:
W0622 16:32:11.296285 14620 analysis_predictor.cc:1780] Paddle2ONNX do't support convert the Model, fall back to using Paddle Inference.

The model I am using is https://paddleocr.bj.bcebos.com/PP-OCRv3/chinese/ch_PP-OCRv3_rec_infer.tar
The Paddle Inference version is 2.3.0.

Could you tell me what is going on here?

I haven't tested the config.EnableONNXRuntime() functionality. Judging from the error, Paddle2ONNX failed to convert the PP-OCRv3 recognition model inside Paddle Inference. Converting the model to ONNX standalone with paddle2onnx 0.8.0 works fine:
paddle2onnx --model_dir ./ch_PP-OCRv3_rec_infer \
    --model_filename inference.pdmodel \
    --params_filename inference.pdiparams \
    --save_file ./inference/rec_onnx/model.onnx \
    --opset_version 10
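After the standalone conversion succeeds, one way to sanity-check the exported model is to run it directly with ONNX Runtime in Python. This is a minimal sketch, not part of the thread: the output path matches the paddle2onnx command above, but the 1x3x48x320 dummy input shape for the PP-OCRv3 rec model is an assumption (width is dynamic; 320 is just an example).

```python
def check_onnx_model(path, input_shape=(1, 3, 48, 320)):
    """Load an exported ONNX model and run one dummy forward pass.

    Sketch only: requires third-party deps (pip install onnxruntime numpy),
    imported lazily so this file can be read/loaded without them.
    The default input_shape assumes a PP-OCRv3 rec model (NCHW, height 48).
    """
    import numpy as np
    import onnxruntime as ort

    sess = ort.InferenceSession(path, providers=["CPUExecutionProvider"])
    input_name = sess.get_inputs()[0].name
    dummy = np.random.rand(*input_shape).astype(np.float32)
    # Returns a list of output arrays; non-empty output means the graph ran.
    return sess.run(None, {input_name: dummy})


# Example (assumes the paddle2onnx command above was run first):
# outputs = check_onnx_model("./inference/rec_onnx/model.onnx")
# print(outputs[0].shape)
```

If this runs without errors, the ONNX export itself is fine, which would point the problem at the automatic Paddle2ONNX conversion path inside Paddle Inference rather than at the model.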

You could open an issue in the Paddle2ONNX repo and ask there: https://github.com/PaddlePaddle/Paddle2ONNX