Specify providers in onnxruntime.InferenceSession
benedetto-michelozzi opened this issue
Starting with onnxruntime 1.9, it is necessary to explicitly specify the execution providers when creating an InferenceSession; otherwise the following error is raised:
"ValueError: This ORT build has ['TensorrtExecutionProvider', 'CUDAExecutionProvider', 'CPUExecutionProvider'] enabled. Since ORT 1.9, you are required to explicitly set the providers parameter when instantiating InferenceSession. For example, onnxruntime.InferenceSession(..., providers=['TensorrtExecutionProvider', 'CUDAExecutionProvider', 'CPUExecutionProvider'], ...)"
As an example, in my specific case, line #11 of onnx_infer.py should be modified to:
self.session = onnxruntime.InferenceSession(model, providers=['CUDAExecutionProvider', 'CPUExecutionProvider'])
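A more robust variant would pick the providers based on what the installed ORT build actually supports, rather than hard-coding them. Below is a minimal sketch of that idea: the helper name `select_providers` is hypothetical (not part of the onnxruntime API), and in practice the `available` list would come from the real `onnxruntime.get_available_providers()` call.

```python
def select_providers(preferred, available):
    """Keep only the preferred providers that this ORT build supports,
    always falling back to CPUExecutionProvider as a last resort."""
    chosen = [p for p in preferred if p in available]
    if "CPUExecutionProvider" not in chosen:
        chosen.append("CPUExecutionProvider")
    return chosen

# In a real script, `available` would be onnxruntime.get_available_providers().
available = ["TensorrtExecutionProvider", "CUDAExecutionProvider", "CPUExecutionProvider"]
providers = select_providers(["CUDAExecutionProvider", "CPUExecutionProvider"], available)
print(providers)
# The session would then be created as:
# self.session = onnxruntime.InferenceSession(model, providers=providers)
```

This avoids the ValueError on CPU-only builds, where requesting `CUDAExecutionProvider` alone would fail.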