Minor issue: batch size has to be set to a factor of the training set size
Tgaaly opened this issue · comments
This is a minor issue: when using SAE, the batch size has to be a factor of the training set size (I haven't checked other models, but I assume it's the same). If my training set size happens to be non-divisible by the chosen batch size, saetrain(.) fails. A possible fix would be to handle this internally (inside saetrain(.)) by taking overlapping subsets for each batch.
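As a rough illustration of the proposed workaround (a hypothetical sketch, not the library's actual code; `batch_indices` is an invented helper name), the idea is to shift the final window back so that every batch still contains exactly `batch_size` examples, at the cost of the last batch overlapping the previous one:

```python
# Hypothetical sketch of overlapping batching: partition n examples into
# windows of batch_size; if batch_size does not divide n, slide the final
# window back so it still holds exactly batch_size examples (overlapping
# the previous batch) instead of failing.
def batch_indices(n, batch_size):
    """Yield (start, end) index pairs covering range(n); last pair may overlap."""
    for s in range(0, n, batch_size):
        e = s + batch_size
        if e > n:
            # shift the final window back to keep a full-size batch
            s, e = n - batch_size, n
        yield (s, e)

# e.g. 10 examples with batch size 4 -> (0, 4), (4, 8), (6, 10)
```

With this scheme each example is seen at least once per epoch, and the duplicated examples in the final batch only slightly over-weight a few samples, which is usually harmless for SGD-style training.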