About the batch size.
kevinbro96 opened this issue · comments
kevinbro96 commented
I find that in both the WCT code and the OST code, the batch size is set to 1, which limits inference speed.
Can we set the batch size to 64? In other words, can we run WCT and OST in a batched (or distributed) fashion?
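In principle the WCT itself is batchable, since the whitening and coloring steps are per-sample eigendecompositions that `np.linalg.eigh` (and likewise `torch.linalg.eigh`) can apply to a stack of covariance matrices at once. Below is a minimal NumPy sketch of a batched WCT, written from the standard WCT formulation rather than taken from either repository's actual code; the function name `batched_wct` and the `(B, C, H*W)` layout are my own assumptions:

```python
import numpy as np

def batched_wct(content, style, eps=1e-5):
    """Batched Whitening-Coloring Transform (sketch, not the repo's code).

    content, style: arrays of shape (B, C, N), where N = H*W flattened
    spatial positions. Returns stylized features of shape (B, C, N).
    """
    B, C, N = content.shape
    # Center each sample's features.
    cm = content.mean(axis=2, keepdims=True)
    sm = style.mean(axis=2, keepdims=True)
    fc = content - cm
    fs = style - sm
    # Batched covariance matrices, shape (B, C, C); eps regularizes.
    cov_c = fc @ fc.transpose(0, 2, 1) / (N - 1) + eps * np.eye(C)
    cov_s = fs @ fs.transpose(0, 2, 1) / (fs.shape[2] - 1) + eps * np.eye(C)
    # eigh accepts stacked (B, C, C) inputs, so no Python-level loop here.
    wc, vc = np.linalg.eigh(cov_c)
    ws, vs = np.linalg.eigh(cov_s)
    # Whitening matrix cov_c^{-1/2} = V diag(w^{-1/2}) V^T, batched.
    inv_sqrt_c = (vc * (1.0 / np.sqrt(wc))[:, None, :]) @ vc.transpose(0, 2, 1)
    # Coloring matrix cov_s^{1/2} = V diag(w^{1/2}) V^T, batched.
    sqrt_s = (vs * np.sqrt(ws)[:, None, :]) @ vs.transpose(0, 2, 1)
    # Whiten the content features, color with style statistics, re-add mean.
    return sqrt_s @ (inv_sqrt_c @ fc) + sm
```

A PyTorch port of the same idea (using `torch.linalg.eigh` on GPU) is what a batch size of 64 would actually need in these repositories; the practical obstacle is usually memory and the per-sample image sizes rather than the math.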