tinapan-pt / VideoMoCo

Official PyTorch implementation of the paper "VideoMoCo: Contrastive Video Representation Learning with Temporally Adversarial Examples" (CVPR 2021).

OOM error when using more than 24 GPUs

rtygbwwwerr opened this issue · comments

It's a very interesting work, but I was wondering how many GPUs you used when training VideoMoCo. We found that if we use more than 24 GPUs (V100 32G), the training process is blocked by a GPU OOM error in the `_batch_shuffle` function.

Hi, we use 8 GPUs (V100).
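A plausible reason the OOM appears in `_batch_shuffle` at large GPU counts: in MoCo-style batch shuffling, each GPU all-gathers the full global batch before permuting it, so per-GPU memory for the gathered tensor grows linearly with the number of GPUs. A minimal back-of-the-envelope sketch, with hypothetical clip sizes (16 frames of 3x224x224 float32, 8 clips per GPU), not taken from the repository:

```python
def gathered_bytes(clips_per_gpu, clip_bytes, world_size):
    """Bytes one GPU needs to hold the all-gathered global batch.

    MoCo-style shuffling gathers every GPU's batch onto every GPU,
    so this scales linearly with world_size.
    """
    return clips_per_gpu * clip_bytes * world_size

# Hypothetical clip: 16 frames x 3 channels x 224 x 224, float32 (4 bytes).
clip_bytes = 16 * 3 * 224 * 224 * 4

for n_gpus in (8, 24, 32):
    gib = gathered_bytes(8, clip_bytes, n_gpus) / 2**30
    print(f"{n_gpus} GPUs -> {gib:.2f} GiB gathered per GPU")
```

This extra allocation sits on top of activations and the momentum encoder, which is consistent with training fitting at 8 GPUs but failing past 24; shuffling within each node, or shuffling indices instead of the video tensors, are common workarounds.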