Why is samples_per_gpu set to 1?
buaazeus opened this issue · comments
zeus commented
Can samples_per_gpu be set to 2?
Thank you.
Jiazhi Yang commented
It means batch_size is 1 in training. You cannot set it to 2 (or higher) in the current version due to the complex data/tensor reshaping operations in each module. We will clean up the code to support larger batch sizes in the future.
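For context, in mmcv/mmdetection-style configs `samples_per_gpu` is the per-GPU batch size handed to the DataLoader. The sketch below is hypothetical (not this repository's actual code) and only illustrates the failure mode described above: a module step that hard-codes a batch dimension of 1 works with `samples_per_gpu = 1` but breaks as soon as a second sample is batched in.

```python
def squeeze_batch(shape):
    """Hypothetical module step that drops the batch dimension,
    assuming it is always 1 (e.g. reshaping (1, N, C) -> (N, C)).

    Shapes are plain tuples here to keep the sketch dependency-free.
    """
    b, n, c = shape
    if b != 1:
        # With samples_per_gpu = 2, this hidden assumption fails.
        raise ValueError(f"module assumes batch_size == 1, got {b}")
    return (n, c)

# Batch size 1: the reshape succeeds.
print(squeeze_batch((1, 900, 256)))  # -> (900, 256)
```

Supporting larger batches means replacing every such hard-coded squeeze with a batch-aware reshape that carries `b` through, which is the cleanup the maintainers refer to.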