batch inference arcface model with tensorrt
nguyentrongnhat4869 opened this issue · comments
I'm trying to convert my ONNX model from a static input (1,3,112,112) to a dynamic input (?,3,112,112) so that I can run batched inference. I see that in your project the function add_dyn_batch (in src/converters/modules/converters/insight2onnx.py) seems to do this before the model is converted to TRT. I used that function to make the ONNX model's input dynamic, and that step worked, but when I then tried to convert the modified model to TRT I got these errors:
Error when parsing ONNX file
[TensorRT] ERROR: (Unnamed Layer * 480) [Shuffle]: at most one dimension may be inferred
Error when building an engine
[TensorRT] ERROR: Network must have at least one output
P/s: The ONNX -> TRT conversion works fine with the static input (1,3,112,112).
Do you have a solution for this problem?
Thank you very much
Hi, @nguyentrongnhat4869! add_dyn_batch isn't used anywhere in the project anymore and will be deleted, thanks!
Try using reshape_onnx.py instead.
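Also note that a dynamic-shape ONNX model alone isn't enough: TensorRT builds a dynamic-batch engine only when the network uses explicit batch mode and an optimization profile supplies min/opt/max shapes. One way to do that is with the trtexec CLI; this is a hedged example, and the input tensor name "data" and the file names are placeholders to adjust for your model:

```shell
# Build a dynamic-batch engine with trtexec, giving the profile's
# min/opt/max shapes for the input tensor (here named "data").
trtexec --onnx=arcface_dynamic.onnx \
        --minShapes=data:1x3x112x112 \
        --optShapes=data:8x3x112x112 \
        --maxShapes=data:32x3x112x112 \
        --saveEngine=arcface_dynamic.trt
```

Without a profile the builder has no valid shape range, which can surface as errors like "Network must have at least one output" on older TensorRT versions.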
Closing for inactivity, feel free to reopen if you have any questions left.