Support dynamic batch-size inference in SCRFD model?
chunniunai220ml opened this issue · comments
Hi, I have read and tested the code, but I failed to get dynamic batch-size inference working with the SCRFD model.
Do you have any advice for me?
Hi! Dynamic batch inference is not yet supported.
Though you should be able to at least build TRT engine with batch support. Could you explain which errors you get?
I have not managed to build the TRT engine with batch support yet.
When I set batch-size = -1 in this repo:
File "/home/sharemnt/xielu/codes/trt-scrfd/InsightFace-REST/src/api_trt/modules/converters/onnx_to_trt.py", line 58, in _build_engine_onnx
    profile.set_shape(input_name, (1, 3) + im_size, (1, 3) + im_size, (max_batch_size, 3) + im_size)
RuntimeError: Shape provided for max is inconsistent with other shapes.
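For context, TensorRT optimization profiles require every dimension of the min/opt/max shapes to be positive and to satisfy min <= opt <= max elementwise, which is why max_batch_size = -1 fails here. A minimal pure-Python sketch of that validation (not the actual TensorRT code, just an illustration of the constraint):

```python
def check_profile_shapes(min_shape, opt_shape, max_shape):
    """Mimic TensorRT's optimization-profile validation:
    min <= opt <= max elementwise, and every dimension positive."""
    for lo, opt, hi in zip(min_shape, opt_shape, max_shape):
        if not (lo <= opt <= hi):
            raise RuntimeError("Shape provided for max is inconsistent with other shapes.")
        if lo < 1:
            raise RuntimeError("Profile dimensions must be positive, got %d" % lo)
    return True

im_size = (640, 640)  # assumed detector input resolution

# max_batch_size = -1 reproduces the error: max dim 0 (-1) < opt dim 0 (1)
try:
    check_profile_shapes((1, 3) + im_size, (1, 3) + im_size, (-1, 3) + im_size)
except RuntimeError as e:
    print(e)  # -> Shape provided for max is inconsistent with other shapes.

# A positive max batch size passes:
print(check_profile_shapes((1, 3) + im_size, (1, 3) + im_size, (4, 3) + im_size))
```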
When I set bs = 4, the output shapes from SCRFD are (bs, 1) instead of the expected (bs, 9600, 1), (bs, 2400, 1), (bs, 600, 1). Wrong shapes.
Have you tried running deploy_trt.sh after changing det_batch_size?
> When I set batch-size = -1 in this repo:
> File "/home/sharemnt/xielu/codes/trt-scrfd/InsightFace-REST/src/api_trt/modules/converters/onnx_to_trt.py", line 58, in _build_engine_onnx
> profile.set_shape(input_name, (1, 3) + im_size, (1, 3) + im_size, (max_batch_size, 3) + im_size)
> RuntimeError: Shape provided for max is inconsistent with other shapes.
TensorRT doesn't support fully dynamic dimensions in optimization profiles: the min/opt/max shapes all require positive batch sizes, so passing max_batch_size = -1 makes the max shape smaller than the min/opt shapes and triggers that error. Build the engine with a positive det_batch_size instead; the resulting engine accepts any batch size in the range [1, det_batch_size].
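Concretely, with a positive det_batch_size the three shapes handed to profile.set_shape look like this (640x640 input and det_batch_size = 4 are just example values, not defaults from the repo):

```python
# Hypothetical values mirroring the converter's set_shape arguments
im_size = (640, 640)       # assumed detector input resolution
det_batch_size = 4         # any positive integer works

min_shape = (1, 3) + im_size               # smallest batch the engine accepts
opt_shape = (1, 3) + im_size               # batch size TensorRT optimizes for
max_shape = (det_batch_size, 3) + im_size  # largest batch the engine accepts

print(min_shape, opt_shape, max_shape)

# At inference time any batch size b with 1 <= b <= det_batch_size is valid:
valid_batches = list(range(1, det_batch_size + 1))
print(valid_batches)  # -> [1, 2, 3, 4]
```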
> When I set bs = 4, the output shapes from SCRFD are (bs, 1) instead of the expected (bs, 9600, 1), (bs, 2400, 1), (bs, 600, 1). Wrong shapes.
Are you trying to convert the models automatically downloaded with the latest version of this repo?
The original SCRFD models don't support batching at all.
SCRFD models now support batch inference, closing.