jason-li-831202 / YOLO-NAS-onnxruntime

This repo provides a C++ implementation of YOLO-NAS based on ONNX Runtime for real-time object detection. Supports float32/float16/int8 inference.

When I try to run the code, I always get an Ort::Exception, whether running on an image or a video

ukicomputers opened this issue

Hi! I have a converted (not custom, default COCO) YOLO-NAS-S model, exported from the default .pth to ONNX. When I execute the program with the following arguments:

./demo -m ~/models/dnn/yolo_nas_s.onnx -s 0 -c ~/Downloads/coco.names

I get this error when running video detection (this is the whole log):

[INFO ] Onnxruntime Version:16
[INFO ] Inference device: CPU
[INFO ] Inference model: /home/uki/models/dnn/yolo_nas_s.onnx
[INFO ] ---------------- Input info --------------
[INFO ] Name [0]: input.1
[INFO ] Shape [0]: (1, 3, 640, 640, )
[INFO ] --------------- Output info --------------
[INFO ] Name [0]: 913
[INFO ] Shape [0]: (1, 8400, 4, )
[INFO ] Name [1]: 904
[INFO ] Shape [1]: (1, 8400, 80, )
[INFO ] Class num: -1
[INFO ] ==========================================
[INFO ] Model was initialized.
[INFO ] Current FPS : 30
terminate called after throwing an instance of 'Ort::Exception'
  what():  Invalid Output Name:h���
Aborted (core dumped)

And this is the error with an image, even though the given image works fine: it ends in .jpg as the code expects, and it has been tested with OpenCV test code, which gives correct results. This is also the full log:

[INFO ] =============== Model info ===============
[INFO ] Onnxruntime Version:16
[INFO ] Inference device: CPU
[INFO ] Inference model: /home/uki/models/dnn/yolo_nas_s.onnx
[INFO ] ---------------- Input info --------------
[INFO ] Name [0]: input.1
[INFO ] Shape [0]: (1, 3, 640, 640, )
[INFO ] --------------- Output info --------------
[INFO ] Name [0]: 913
[INFO ] Shape [0]: (1, 8400, 4, )
[INFO ] Name [1]: 904
[INFO ] Shape [1]: (1, 8400, 80, )
[INFO ] Class num: -1
[INFO ] ==========================================
[INFO ] Model was initialized.
terminate called after throwing an instance of 'Ort::Exception'
  what():  Invalid Output Name:eg;*.jpg;*.jpe)

I am using the latest version of ONNX Runtime, 1.16.3.
The ONNX file has been tested and works properly.

When I debugged with Visual Studio Code, I found the error occurs specifically at this line, where outputNames.size() is evaluated, i.e. when session.Run is called to get the detection results:

std::vector<Ort::Value> outputTensors = this->session.Run(Ort::RunOptions{nullptr},
                                                              inputNames.data(),
                                                              inputTensors.data(),
                                                              inputNames.size(),
                                                              outputNames.data(),
                                                              outputNames.size() );

Do you know what the issue might be? Thanks in advance. If you need more info, please leave a comment.
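For reference (not the repo's own code): with ONNX Runtime 1.13 and newer, the C++ API returns input/output names as Ort::AllocatedStringPtr. If only the raw char* from .get() is stored, the pointer dangles and can surface as garbage in exactly this kind of "Invalid Output Name: ..." message. Below is a minimal sketch of keeping the names alive, assuming a plain Ort::Session; the helper function and variables here are hypothetical, not part of this repo:

// Hypothetical sketch, not the repo's implementation: collect output names
// with ONNX Runtime >= 1.13 while keeping the underlying strings alive.
#include <onnxruntime_cxx_api.h>
#include <string>
#include <vector>

static std::vector<std::string> ownedOutputNames;  // owns the characters
static std::vector<const char*> outputNames;       // raw pointers passed to Run()

void collectOutputNames(Ort::Session& session)
{
    Ort::AllocatorWithDefaultOptions allocator;
    for (size_t i = 0; i < session.GetOutputCount(); ++i)
    {
        // GetOutputNameAllocated() returns an owning smart pointer; copy the
        // string so the characters outlive this loop iteration.
        Ort::AllocatedStringPtr name = session.GetOutputNameAllocated(i, allocator);
        ownedOutputNames.emplace_back(name.get());
    }
    for (const std::string& n : ownedOutputNames)
        outputNames.push_back(n.c_str());
}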

Use the provided convertPytorchToONNX.py script for the conversion, as I adopted the same output format as the other YOLO versions. You can refer to the following log:

[INFO ] Name [0]: outputs
[INFO ] Shape [0]: (1, 8400, 85, )
[INFO ] Class num: 80
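This also explains the "Class num: -1" in the logs above: if the class count is read as the last output dimension minus 5, the expected merged output gives 85 - 5 = 80, while the box head of the raw two-output export gives 4 - 5 = -1. A minimal sanity-check sketch under that assumption (the function below is hypothetical, not part of the repo):

// Hypothetical sanity check, assuming a single merged output shaped
// (1, anchors, 5 + numClasses) as produced by convertPytorchToONNX.py.
#include <cstdint>
#include <stdexcept>
#include <vector>

int checkOutputLayout(const std::vector<std::vector<int64_t>>& outputShapes)
{
    if (outputShapes.size() != 1)
        throw std::runtime_error("Expected one merged output like (1, 8400, 85); "
                                 "re-export the model with convertPytorchToONNX.py");

    const int64_t numClasses = outputShapes[0].back() - 5;  // e.g. 85 - 5 = 80
    if (numClasses <= 0)
        throw std::runtime_error("Class num <= 0: the ONNX output does not match "
                                 "the expected merged (boxes + scores) layout");
    return static_cast<int>(numClasses);
}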