Is it possible to run this model on multiple GPUs?
youssefavx opened this issue
youssefavx commented
(Talking of NVSR) I ran this model on a cloud A100 80 GB GPU and managed to upsample files of up to 7 minutes. I'm now curious how far it can go: if I had 8x A100 GPUs, would it be possible to upsample a 56-minute file? Or is this model not designed to run inference on multiple GPUs?
(I know my workaround so far has been splitting long audio files and then upsampling the pieces, but I'd like to avoid splitting.)
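For reference, the splitting workaround can be made fairly painless with overlapping chunks and a crossfade at the seams, which avoids audible clicks at chunk boundaries. This is just a sketch, not anything from the NVSR codebase: `upsample` below is a hypothetical placeholder for the model's inference call, and the per-chunk dispatch is where each chunk could be sent to a different GPU (e.g. `cuda:{i % 8}`) if the model itself only runs single-GPU.

```python
import numpy as np

def chunk_audio(x, sr, chunk_s=60.0, overlap_s=1.0):
    """Split mono audio x into overlapping chunks of chunk_s seconds."""
    chunk = int(chunk_s * sr)
    overlap = int(overlap_s * sr)
    hop = chunk - overlap
    chunks, start = [], 0
    while start < len(x):
        chunks.append(x[start:start + chunk])
        if start + chunk >= len(x):
            break
        start += hop
    return chunks

def merge_chunks(chunks, sr, overlap_s=1.0):
    """Concatenate chunks, linearly crossfading over the overlap region."""
    overlap = int(overlap_s * sr)
    fade_in = np.linspace(0.0, 1.0, overlap)
    out = chunks[0].copy()
    for c in chunks[1:]:
        ov = min(overlap, len(c))
        # Blend the tail of the output with the head of the next chunk.
        out[-ov:] = out[-ov:] * (1.0 - fade_in[:ov]) + c[:ov] * fade_in[:ov]
        out = np.concatenate([out, c[ov:]])
    return out

def upsample(chunk, device="cuda:0"):
    # Hypothetical placeholder for the NVSR inference call on one chunk.
    # With one process per GPU, chunk i could go to f"cuda:{i % n_gpus}".
    return chunk

sr = 16000
x = np.random.rand(sr * 10).astype(np.float32)  # 10 s of test audio
chunks = chunk_audio(x, sr, chunk_s=3.0, overlap_s=0.5)
y = merge_chunks([upsample(c) for c in chunks], sr, overlap_s=0.5)
```

With an identity `upsample`, the merged signal matches the input exactly, so the chunking itself adds no artifacts; any seam differences come only from the model's edge behavior, which the crossfade smooths.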