microsoft / i-Code


CUDA out of memory

CHY-coder opened this issue · comments

Hi, dear authors:
Thanks for sharing this great work. I get the message "CUDA out of memory" when I run demo.ipynb:

"CUDA out of memory. Tried to allocate 20.00 MiB (GPU 0; 23.69 GiB total capacity; 20.70 GiB already allocated; 3.00 MiB free; 21.07 GiB reserved in total by PyTorch) If reserved memory is >> allocated memory try setting max_split_size_mb to avoid fragmentation. See documentation for Memory Management and PYTORCH_CUDA_ALLOC_CONF"

I have 2 RTX 4090s with 24 GB of memory each, but it seems only one of them is used. How can I use both GPUs when I run demo.ipynb?
Besides, I am using the fp16 checkpoint.
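One general note, not specific to this repo: `torch.nn.DataParallel` replicates the whole model on every card, so it does not reduce per-GPU memory. Fitting a model that overflows a single 24 GB card across two GPUs usually means placing different sub-modules on different devices and moving activations between them. A minimal, self-contained sketch of that pattern with toy layers standing in for the real network (the module names and shapes are placeholders, not the actual demo.ipynb structure):

```python
import torch
import torch.nn as nn

# Toy two-stage "model": the point is only the device-placement pattern.
stage1 = nn.Linear(16, 32).to("cuda:0")   # first part of the network on GPU 0
stage2 = nn.Linear(32, 8).to("cuda:1")    # second part on GPU 1

x = torch.randn(4, 16, device="cuda:0")   # inputs live where the first stage lives
hidden = stage1(x)
out = stage2(hidden.to("cuda:1"))         # move activations between GPUs explicitly
print(out.shape)
```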

This problem can be fixed by setting fp16=True.
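For anyone hitting the same issue: half precision roughly halves the weight and activation memory, which is presumably what the `fp16=True` flag does inside the model-construction code. A small illustration of the effect, unrelated to the actual i-Code API:

```python
import torch
import torch.nn as nn

net = nn.Linear(1024, 1024).cuda().half()        # fp16 weights
x = torch.randn(8, 1024, device="cuda").half()   # fp16 inputs to match
with torch.no_grad():
    y = net(x)
print(y.dtype)  # torch.float16
```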