onnx inference time better than tensorrt inference time
mamadouDembele opened this issue · comments
Mamadou commented
Thanks for your amazing work. It seems the inference time of the ONNX model is better than that of the TensorRT model. Is there anything wrong with my testing? I got 150 ms inference time for the ONNX model and 770 ms for the TensorRT model.
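A common cause of numbers like this is timing the very first call, which for TensorRT can include engine deserialization and CUDA context setup. A minimal timing sketch (pure Python, with a hypothetical `dummy_infer` standing in for `session.run(...)` on either backend) that separates warm-up from measured runs might look like:

```python
import time

def benchmark(fn, warmup=10, runs=100):
    """Return average latency of fn in milliseconds, excluding warm-up calls."""
    # Warm-up: the first calls may include one-time lazy initialization
    # (engine build/deserialization, CUDA context creation, allocator setup).
    for _ in range(warmup):
        fn()
    start = time.perf_counter()
    for _ in range(runs):
        fn()
    return (time.perf_counter() - start) / runs * 1000.0

# Hypothetical stand-in for an actual inference call such as
# onnxruntime's session.run(None, inputs) or a TensorRT execute_v2().
def dummy_infer():
    sum(i * i for i in range(1000))

avg_ms = benchmark(dummy_infer)
print(f"average latency: {avg_ms:.3f} ms")
```

Comparing backends with the same warm-up and averaging policy usually changes the picture considerably; GPU timing also requires synchronizing the device before stopping the clock, or the measured time reflects only kernel launch.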
xuanandsix commented