Inference time of part-based person re-identification
faizan1234567 opened this issue · comments
Dear Author,
Thanks for the amazing work. I would like to know whether there is any data on inference time. Does your model support real-time inference on a video?
Hi, the inference time should mainly depend on the backbone used; the attention head presented in the paper shouldn't add much computational overhead (although I don't have precise numbers on this). If HRNet32 is too heavy for your inference hardware, you could try a smaller backbone.
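Since no timing numbers are reported, one way to get them for your own hardware is to benchmark the forward pass directly. A minimal, dependency-free sketch (the `run_model` callable and the dummy workload below are placeholders, not part of this repo; in practice you would pass the actual network's forward function and a real input tensor):

```python
import time

def benchmark(run_model, frame, warmup=5, iters=50):
    """Return (average latency in seconds, throughput in FPS) for run_model."""
    for _ in range(warmup):
        # Warm-up iterations: exclude one-time costs (lazy init, caches, JIT).
        run_model(frame)
    start = time.perf_counter()
    for _ in range(iters):
        run_model(frame)
    elapsed = time.perf_counter() - start
    latency = elapsed / iters
    return latency, 1.0 / latency

# Stand-in "model": replace with the real forward pass, e.g. model(frame).
dummy_model = lambda x: sum(x)
latency, fps = benchmark(dummy_model, list(range(1000)))
print(f"{latency * 1e6:.1f} us/frame, {fps:.0f} FPS")
```

Note that for a GPU model you would also need to synchronize the device before reading the timer (e.g. `torch.cuda.synchronize()` in PyTorch), otherwise the measured latency only reflects kernel launch time.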