breezedeus / CnOCR

CnOCR: Awesome Chinese/English OCR Python toolkits based on PyTorch. It comes with 20+ well-trained models for different application scenarios and can be used directly after installation. (Chinese/English OCR Python package based on PyTorch/MXNet.)

Home Page: https://www.breezedeus.com/article/cnocr


ValueError: This ORT build has ['TensorrtExecutionProvider', 'CUDAExecutionProvider', 'CPUExecutionProvider'] enabled. Since ORT 1.9, you are required to explicitly set the providers parameter when instantiating InferenceSession. For example, onnxruntime.InferenceSession(..., providers=['TensorrtExecutionProvider', 'CUDAExecutionProvider', 'CPUExecutionProvider'], ...)

onui05211 opened this issue · comments

I searched the internet for solutions and tried many of them, but still could not resolve it. I changed
model = onnxruntime.InferenceSession(self._model_fp)
to
onnxruntime.InferenceSession(..., providers=['TensorrtExecutionProvider', 'CUDAExecutionProvider', 'CPUExecutionProvider'])
Has anyone else run into this?

This issue has been resolved. Following that hint, I changed line 187 of recognizer.py from
model = onnxruntime.InferenceSession(self._model_fp)
to
model = onnxruntime.InferenceSession(self._model_fp, providers=['CUDAExecutionProvider', 'CPUExecutionProvider'])
and line 157 of utility.py from
sess = ort.InferenceSession(model_file_path)
to
sess = ort.InferenceSession(model_file_path, providers=['TensorrtExecutionProvider', 'CUDAExecutionProvider', 'CPUExecutionProvider'])
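For anyone hitting the same error, a minimal self-contained sketch of the same idea (the load_session helper and the model path are illustrative, not cnocr's own API, and the exact file/line layout varies between cnocr versions); it queries the providers actually available in the installed onnxruntime build instead of hard-coding them:

import onnxruntime as ort

def load_session(model_path):
    # ORT >= 1.9 requires providers to be passed explicitly whenever the
    # installed build registers more than the CPU provider (e.g. onnxruntime-gpu).
    providers = ort.get_available_providers() or ['CPUExecutionProvider']
    return ort.InferenceSession(model_path, providers=providers)

# Example usage with a hypothetical model file:
# sess = load_session('path/to/model.onnx')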

With the same installation steps, some PCs work fine while others report this error. What could be causing it?
Given this, will the code be updated to handle it?

Downgrading onnxruntime to a version below 1.16 should fix it. The newer onnxruntime releases changed the interface; a new version of cnocr will be released in a few days.
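Until the new cnocr release is out, a quick way to check which case a given machine falls into is to print the installed onnxruntime version and its registered providers (a minimal diagnostic, assuming onnxruntime is importable):

import onnxruntime as ort

# CPU-only builds register just CPUExecutionProvider and do not trigger the
# error; builds with GPU providers enabled (e.g. onnxruntime-gpu) do, on ORT >= 1.9,
# unless providers is passed explicitly.
print(ort.__version__)
print(ort.get_available_providers())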