Training question
DawenZhouCver opened this issue
DawenZhouCver commented
Thanks for your amazing work! I have a question regarding your training:
How many GPUs did you use for parallel training? How many hours does it take before early stopping?
Jingyun Liang commented
For your reference, it takes about 1.8 days on 8 x 2080 Ti GPUs for 500K iterations (batch size = 32, patch size = 48x48).
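As a rough sanity check, the reported numbers imply the following throughput (a back-of-the-envelope calculation from the figures above, not an official benchmark):

```python
# Throughput implied by the reported setup:
# 500K iterations in ~1.8 days on 8 x 2080 Ti, batch size 32.
iterations = 500_000
days = 1.8
batch_size = 32

seconds = days * 24 * 3600            # total wall-clock seconds
iters_per_sec = iterations / seconds  # training iterations per second
sec_per_iter = seconds / iterations   # wall-clock time per iteration
patches_per_sec = iters_per_sec * batch_size  # 48x48 patches processed per second

print(f"{iters_per_sec:.2f} it/s, {sec_per_iter:.3f} s/iter, "
      f"{patches_per_sec:.0f} patches/s")
```

So each iteration takes roughly 0.3 s of wall-clock time across the 8 GPUs.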