onnx inference
1234269 opened this issue
Error:
onnxruntime.capi.onnxruntime_pybind11_state.RuntimeException: [ONNXRuntimeError] : 6 : RUNTIME_EXCEPTION : Non-zero status code returned while running Scan node. Name:'custom_rnn_scan_Scan__25' Status Message: Non-zero status code returned while running GreaterOrEqual node. Name:'bidirectional_rnn/bw/bw/while/GreaterEqual_2' Status Message: /onnxruntime_src/include/onnxruntime/core/framework/op_kernel_context.h:42 const T* onnxruntime::OpKernelContext::Input(int) const [with T = onnxruntime::Tensor] Missing Input: bidirectional_rnn/bw/ToInt32:0
Could you give some tips for running inference with the exported ONNX model?
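For reference, a minimal sketch of a standard onnxruntime inference call (the model path, input name, and dummy input shape below are placeholders, not taken from this project):

```python
import numpy as np
import onnxruntime as ort

# Load the exported model on CPU; "model.onnx" is a placeholder path.
sess = ort.InferenceSession("model.onnx", providers=["CPUExecutionProvider"])

# Inspect the graph's declared inputs to build the feed dict correctly.
for inp in sess.get_inputs():
    print(inp.name, inp.shape, inp.type)

input_name = sess.get_inputs()[0].name
dummy = np.zeros((1, 100, 80), dtype=np.float32)  # placeholder shape/dtype

# Run the model; passing None returns all graph outputs.
outputs = sess.run(None, {input_name: dummy})
print([o.shape for o in outputs])
```

The error above is raised inside a Scan subgraph at runtime rather than at session creation, so checking that every declared input is actually fed (via `sess.get_inputs()`) is a reasonable first step, but it may also indicate a problem introduced during model conversion.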
ONNX is not supported; the project is unmaintained.