How to use learner.distributed() in self-supervised pretraining code?
lileishitou opened this issue · comments
lilei commented
How can I use the self-supervised pretraining code to train on multiple GPUs or on multiple nodes?
I want to revise the code for multi-node or multi-GPU training, since I have a large dataset and a large model, but I have not succeeded so far.
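For reference, here is a minimal sketch of the kind of change I tried, assuming a fastai-v2-style API (`fastai.distributed` with `distrib_ctx()`); `get_dls` and `build_ssl_learner` are hypothetical placeholders for however the pretraining script builds its DataLoaders and Learner, and the script cannot run standalone because it must be started by a multi-process launcher:

```python
# Sketch only: assumes fastai v2's distributed API (fastai.distributed).
# Launch one process per GPU, e.g.:
#   python -m fastai.launch pretrain.py
from fastai.vision.all import *
from fastai.distributed import *

def main():
    # Hypothetical helpers standing in for the repo's own setup code.
    dls = get_dls()
    learn = build_ssl_learner(dls)
    # distrib_ctx() wraps the model in DistributedDataParallel for the
    # duration of the `with` block when multiple processes are launched.
    with learn.distrib_ctx():
        learn.fit_one_cycle(5, 1e-3)

if __name__ == "__main__":
    main()
```

Is this the intended way to use it here, or does this repo expect a different entry point for distributed pretraining?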