mingyuliutw / UNIT

Unsupervised Image-to-Image Translation

Multiple GPU training?

harsmac opened this issue · comments

I have high-resolution images and I am running the training on a GTX 1080 with 12 GB of RAM. I reduced the random crops to 64x64 to let the training proceed; the GPU runs out of memory whenever the crop size is larger than 64x64. Hence I was wondering whether training could be run on multiple GPUs to speed it up?

Multi-GPU training is a bit tricky with the current implementation. Stay tuned: we plan a new iteration of the UNIT method, which would support multi-GPU training and hopefully better quality.

Any update on this?

Hi @mingyuliutw I am also wondering about the multi-GPU implementation?
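While waiting for official support, one generic workaround in PyTorch is to wrap the networks in `nn.DataParallel`, which splits each batch across the visible GPUs. The sketch below is not UNIT's actual code; `TinyGenerator` is a hypothetical stand-in for the repo's generator networks, and whether this works out of the box depends on how the training loop accesses module attributes:

```python
import torch
import torch.nn as nn

# Hypothetical stand-in for a UNIT generator; the real networks live in the repo.
class TinyGenerator(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 16, 3, padding=1),   # keeps spatial size (kernel 3, pad 1)
            nn.ReLU(inplace=True),
            nn.Conv2d(16, 3, 3, padding=1),
        )

    def forward(self, x):
        return self.net(x)

model = TinyGenerator()
if torch.cuda.device_count() > 1:
    # Replicate the module on each GPU and scatter the batch dimension.
    model = nn.DataParallel(model)

device = "cuda" if torch.cuda.is_available() else "cpu"
model = model.to(device)

batch = torch.randn(4, 3, 64, 64, device=device)  # 64x64 crops, as in the issue
out = model(batch)
print(out.shape)  # torch.Size([4, 3, 64, 64])
```

Note that `DataParallel` only speeds up the batch; it does not by itself let a single crop exceed one GPU's memory, so larger crops would still need a smaller per-GPU batch or gradient checkpointing.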