WongKinYiu / YOLO

An MIT rewrite of YOLOv9


Could you assist with the deployment?

MinGiSa opened this issue · comments

Thank you for converting the code to the MIT version and achieving such great results. If possible, could you also assist with deployment, such as ONNX, OpenVINO, or TensorRT?

Hi,

Part of this functionality has been implemented in 6777bd1.

You can run it using the following command:

${...} task=inference task.fast_inference=onnx # or trt

to run inference with ONNX or TensorRT. I must admit that I'm not very familiar with ONNX, TensorRT, or OpenVINO.

The script will automatically generate the weights file if it does not already exist. However, for now you will need to install the TensorRT or ONNX packages manually.

In the next release, I plan to upload the ONNX and TensorRT weights as well. Additionally, I will split the requirements lists to accommodate dependencies on different platforms.

Please note that all the code is currently in an experimental stage.

Best regards,
Hao-Tang, Tsui