Speed up the training
rassaire opened this issue · comments
Jean-Rassaire commented
Hi again,
I remember you mentioned that keeping the batch size at 1 aids model convergence. Typically, we increase the batch size when we have ample computing resources to accelerate learning. Given that we will maintain a batch size of 1, how can we speed up training on a large training dataset?
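In case a concrete illustration helps: one common way to speed up batch-size-1 training is to overlap data loading with computation, so the GPU or training step never waits on disk I/O. The sketch below is not from this repository; `prefetch_loader`, `load_fn`, and `slow_load` are hypothetical names used purely to illustrate background prefetching with the Python standard library.

```python
# Minimal sketch (illustrative, not the repository's code): keep batch size 1
# but hide data-loading latency behind a background prefetch thread.
import threading
import queue
import time

def prefetch_loader(load_fn, indices, buffer_size=4):
    """Yield samples one at a time (batch size 1) while a background
    thread fills a bounded buffer with upcoming samples."""
    buf = queue.Queue(maxsize=buffer_size)
    sentinel = object()  # marks the end of the index stream

    def worker():
        for i in indices:
            buf.put(load_fn(i))  # e.g. disk read / preprocessing
        buf.put(sentinel)

    threading.Thread(target=worker, daemon=True).start()
    while True:
        item = buf.get()
        if item is sentinel:
            break
        yield item

def slow_load(i):
    time.sleep(0.01)  # simulate I/O-bound sample loading
    return i * i

# Each yielded item is still a single sample (batch size 1);
# loading of the next samples proceeds while this one is consumed.
samples = list(prefetch_loader(slow_load, range(5)))
print(samples)  # → [0, 1, 4, 9, 16]
```

Frameworks such as PyTorch offer the same idea built in (e.g. multiple `DataLoader` workers), and mixed-precision arithmetic is another batch-size-independent speed-up; whether either applies here depends on the training setup the maintainers recommend.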
mobaidoctor commented
Hi @rassaire, It looks like there might have been a misunderstanding. Could you please review my previous response? #26 (comment). If you still have questions or need further clarification, feel free to let us know. Thank you!
Jean-Rassaire commented
@mobaidoctor, Your comment was clear; I misinterpreted it. I apologize.