xuanandsix / CLRNet-onnxruntime-and-tensorrt-demo

This is the onnxruntime and TensorRT inference code for CLRNet: Cross Layer Refinement Network for Lane Detection (CVPR 2022). Official code: https://github.com/hongyliu/CLRNet


onnx inference time better than tensorrt inference time

mamadouDembele opened this issue · comments

Thanks for your amazing work. It seems the inference time of the ONNX model is better than that of the TensorRT model. Is there anything wrong with my testing? I got an inference time of 150 ms for the ONNX model and 770 ms for the TensorRT model.

It may be that TensorRT consumes more time on the first run, when it is starting up. The usual method is to run inference multiple times and take the average. For example, here are ten inference runs in my environment:
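The averaging approach above can be sketched as a small benchmarking helper. This is a minimal sketch, assuming only that the inference step is some callable; `fake_infer` below is a hypothetical stand-in for a real call such as an onnxruntime `session.run(...)` or a TensorRT execution-context invocation:

```python
import time

def benchmark(run_inference, warmup=3, iters=10):
    """Return the average latency in milliseconds over `iters` runs,
    after discarding `warmup` untimed runs (first-run startup cost)."""
    for _ in range(warmup):
        run_inference()          # warm up: excluded from timing
    start = time.perf_counter()
    for _ in range(iters):
        run_inference()          # timed runs
    return (time.perf_counter() - start) / iters * 1000.0

# Hypothetical stand-in for a real model inference call.
def fake_infer():
    return sum(range(1000))

avg_ms = benchmark(fake_infer)
print(f"average latency: {avg_ms:.3f} ms")
```

Discarding the warmup runs matters particularly for TensorRT, since the first invocation can include engine/context setup that a steady-state average should not count.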

[Screenshot attachment: timing results for ten inference runs]