Multiple streams with batched inference
Huy0110 opened this issue
Currently, I don't see support for multiple streams with batched inference. If I have many streams to process, batching frames across streams should be faster and more efficient than creating a separate yolov8 object per stream.
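To illustrate the request, here is a minimal sketch of the pattern I have in mind: one shared model, N streams, and one batched forward pass per round instead of N separate model objects. Everything here is hypothetical, not this repo's API: `readers` stands in for per-stream frame sources (e.g. video decoders), and the stub `model` stands in for a YOLOv8 engine that accepts a batch.

```python
from typing import Callable, List, Optional, Tuple

Frame = bytes  # placeholder for an image frame; real code would use an array type


def collect_batch(
    readers: List[Callable[[], Optional[Frame]]],
) -> Tuple[List[Frame], List[int]]:
    """Pull one frame from each active stream and group them into a batch.

    Each reader is a hypothetical interface: it returns the next frame,
    or None once its stream is exhausted. The index list maps each batch
    slot back to its source stream.
    """
    frames, indices = [], []
    for i, read in enumerate(readers):
        frame = read()
        if frame is not None:
            frames.append(frame)
            indices.append(i)
    return frames, indices


def run_batched(model, readers):
    """Drive one shared model over many streams, one batched call per round."""
    while True:
        batch, indices = collect_batch(readers)
        if not batch:
            break  # all streams exhausted
        results = model(batch)  # single forward pass over the whole batch
        for stream_id, result in zip(indices, results):
            yield stream_id, result


# Demo with stub streams and a stub model (stand-ins for real decoders
# and a detection engine).
def make_stream(frames):
    it = iter(frames)
    return lambda: next(it, None)


readers = [make_stream([b"a1", b"a2"]), make_stream([b"b1"])]
stub_model = lambda batch: [f.upper() for f in batch]
print(list(run_batched(stub_model, readers)))
# [(0, b'A1'), (1, b'B1'), (0, b'A2')]
```

The key point is that each round costs one inference call regardless of how many streams are active, which is where the batching speedup over per-stream model objects would come from.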