how do you test your inference time?
KHao123 opened this issue · comments
I tested your model with the speed_test script from https://github.com/facebookresearch/LeViT/blob/main/speed_test.py and got 300 fps on a 2080 Ti and 0.42 fps on CPU, which is inconsistent with your reported 25 fps.
Did you measure the inference time on GPU or CPU?
Thank you.
The time is measured on a single RTX 2080Ti GPU with batch size 1.
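For reference, a batch-size-1 GPU timing loop along the lines of the LeViT speed_test script looks roughly like the sketch below. The `measure_fps` helper and the tiny placeholder model are illustrative, not the repository's actual code; the key points are the warm-up runs and the `torch.cuda.synchronize()` calls, without which GPU timings are misleading because CUDA kernels launch asynchronously.

```python
# Sketch of single-image (batch size 1) latency measurement:
# warm up, synchronize, then time many forward passes.
import time

import torch


def measure_fps(model, device=None, iters=100, warmup=10):
    # Hypothetical helper; falls back to CPU when no GPU is available.
    if device is None:
        device = "cuda" if torch.cuda.is_available() else "cpu"
    model = model.to(device).eval()
    x = torch.randn(1, 3, 224, 224, device=device)  # batch size 1
    with torch.no_grad():
        for _ in range(warmup):          # warm-up (kernel init, caches)
            model(x)
        if device == "cuda":
            torch.cuda.synchronize()     # drain queued GPU work before timing
        start = time.perf_counter()
        for _ in range(iters):
            model(x)
        if device == "cuda":
            torch.cuda.synchronize()     # ensure all timed work has finished
        elapsed = time.perf_counter() - start
    return iters / elapsed               # frames per second


if __name__ == "__main__":
    # Placeholder model standing in for the actual network under test.
    dummy = torch.nn.Sequential(
        torch.nn.Conv2d(3, 8, kernel_size=3),
        torch.nn.AdaptiveAvgPool2d(1),
    )
    print(f"{measure_fps(dummy, iters=20, warmup=2):.1f} fps")
```

Since batch size 1 leaves much of the GPU idle, fps measured this way is typically far lower than throughput measured with large batches, which may explain part of the gap between 300 fps and 25 fps.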