about the required memory size of GPU and training time
amiltonwong opened this issue · comments
Hi, @Atcold ,
Thanks for releasing such a great package. I'm very interested in this work and would like to try it out. Could you tell me how much GPU memory is required to run the training (I only have a 980 Ti with 6 GB)? Also, could you elaborate on how long you trained your network in each setting (unsupervised and supervised), and which GPUs you used?
Thanks!
6 GB are enough for the models I am proposing in the paper.
Training time was around 40 minutes per epoch, and I trained the network for up to 30 epochs, so at most 20 hours per experiment.
My 6 GPUs are GeForce GTX TITAN X cards, and each fits two models nicely (they have 12 GB of RAM), so I was running 12 experiments every night.
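For anyone planning their own runs, the schedule above works out as follows (a quick sanity check of the numbers, nothing model-specific):

```python
# Per-experiment wall-clock time: 40 min/epoch, up to 30 epochs
minutes_per_epoch = 40
max_epochs = 30
hours_per_experiment = minutes_per_epoch * max_epochs / 60

# Parallel capacity: 6 TITAN X GPUs, 2 models per 12 GB card
gpus = 6
models_per_gpu = 2
experiments_per_night = gpus * models_per_gpu

print(hours_per_experiment)      # → 20.0
print(experiments_per_night)     # → 12
```

So a single 6 GB card can hold one model at a time and should finish an experiment in roughly a day of training.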
Please, let me know if you need any other detail.