Multiple GPU training?
harsmac opened this issue · comments
Harshitha Machiraju commented
I have high-resolution images and I am running the training on a GTX 1080 with 12 GB of RAM. I reduced the randomly cropped patches to 64x64 to let the training proceed; the GPU runs out of memory whenever the crop size is larger than 64x64. Hence I was wondering if the training could be run on multiple GPUs to speed it up?
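[For context: the workaround described above, random-cropping large images down to 64x64 before feeding them to the network, can be sketched as below. This is an illustrative helper, not code from the UNIT repository; activation memory grows roughly with crop area, which is why shrinking the crop lets training fit on one GPU.]

```python
import numpy as np

def random_crop(img, size):
    """Return a random size x size crop from an (H, W, C) image array."""
    h, w = img.shape[:2]
    top = np.random.randint(0, h - size + 1)
    left = np.random.randint(0, w - size + 1)
    return img[top:top + size, left:left + size]

# A high-resolution image cropped to the 64x64 patch size used above.
img = np.zeros((256, 256, 3), dtype=np.float32)
crop = random_crop(img, 64)
print(crop.shape)  # (64, 64, 3)
```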
Ming-Yu Liu 劉洺堉 commented
Multiple-GPU training is a bit tricky with the current implementation. Stay tuned. We plan to release a new iteration of the UNIT method, which should support multi-GPU training and, hopefully, better quality.
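[For context: while waiting for official support, a common PyTorch workaround is to wrap the model in `torch.nn.DataParallel`, which splits each batch across the available GPUs. The sketch below uses a tiny stand-in module; the class name `TinyGenerator` is illustrative and not part of the UNIT codebase, and the repo's actual training loop may not be compatible without changes.]

```python
import torch
import torch.nn as nn

class TinyGenerator(nn.Module):
    """Stand-in for a UNIT-style generator; the real network is much larger."""
    def __init__(self):
        super().__init__()
        self.conv = nn.Conv2d(3, 3, kernel_size=3, padding=1)

    def forward(self, x):
        return self.conv(x)

gen = TinyGenerator()
if torch.cuda.device_count() > 1:
    # DataParallel scatters each batch along dim 0 across the visible GPUs
    # and gathers the outputs back on the first device.
    gen = nn.DataParallel(gen)
if torch.cuda.is_available():
    gen = gen.cuda()

x = torch.randn(4, 3, 64, 64)  # a batch of 64x64 crops
if torch.cuda.is_available():
    x = x.cuda()
out = gen(x)
print(out.shape)  # same batch and spatial size as the input
```

Note that `DataParallel` only parallelizes the forward/backward pass; it does not reduce per-GPU memory as effectively as a true model-parallel or distributed setup.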
flagman commented
Any update on this?
Jakub Langr commented
Hi @mingyuliutw I am also wondering about the multi-GPU implementation?