CPU rate is high during inference
alsabahi2030 opened this issue
kemo commented
It seems like the GECToR model is using multiprocessing during inference. I checked the code for relevant information, but I couldn't confirm this. Your help is appreciated!
Alex Skurzhanskyi commented
This is expected, as input data is tokenized during inference.
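If the underlying concern is capping CPU usage during inference, one option in a PyTorch-based setup (which GECToR uses) is to limit PyTorch's thread pools before running predictions. This is a minimal sketch of that general technique, not code from the GECToR repository, and the thread counts shown are arbitrary examples:

```python
# Sketch: limit PyTorch CPU parallelism for inference.
# High CPU rates on multi-core machines usually come from intra-op
# thread pools rather than explicit multiprocessing.
import torch

# Must be called before any inference work starts.
torch.set_num_threads(2)          # intra-op parallelism (per-operator thread pool)
torch.set_num_interop_threads(1)  # inter-op parallelism

print(torch.get_num_threads())    # verify the new limit
```

Setting the `OMP_NUM_THREADS` environment variable before launching the process has a similar effect.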