AI4Finance-Foundation / FinGPT

FinGPT: Open-Source Financial Large Language Models! Revolutionize 🔥 We release the trained model on HuggingFace.

Home Page: https://ai4finance.org

CUDA out of memory on Google Colab when trying to run beginners notebook

mithril9 opened this issue · comments

Hi,

I keep getting

```
OutOfMemoryError: CUDA out of memory. Tried to allocate 508.00 MiB. GPU 0 has a total capacty of 15.77 GiB of which 30.12 MiB is free. Process 44331 has 15.74 GiB memory in use. Of the allocated memory 14.89 GiB is allocated by PyTorch, and 1.11 MiB is reserved by PyTorch but unallocated. If reserved but unallocated memory is large try setting max_split_size_mb to avoid fragmentation. See documentation for Memory Management and PYTORCH_CUDA_ALLOC_CONF
```
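The last sentence of the error points at allocator fragmentation. A minimal sketch of the suggested `max_split_size_mb` tweak, which must be applied before the first CUDA allocation (the value 128 is an arbitrary illustration, not a recommendation from the notebook):

```python
import os

# Must be set before torch makes its first CUDA allocation,
# so ideally at the very top of the notebook, before importing torch.
# max_split_size_mb caps the block size the caching allocator will split,
# which can reduce fragmentation when "reserved but unallocated" is large.
os.environ["PYTORCH_CUDA_ALLOC_CONF"] = "max_split_size_mb:128"

print(os.environ["PYTORCH_CUDA_ALLOC_CONF"])
```

Note that fragmentation tuning only helps at the margin; if the model weights themselves do not fit, the setting makes no difference.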

When trying to run

```python
model = AutoModel.from_pretrained(
    model_name,
    quantization_config=q_config,
    trust_remote_code=True,
    device='cuda',
)
```
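Since the OOM happens while loading the model (not during training), the batch size is not the lever. A back-of-envelope check of the weight footprint shows why a ~16 GiB card is tight; the ~6B parameter count is an assumption for illustration (the beginner notebook uses a ChatGLM2-6B-class base model):

```python
# Rough weight-only memory footprint at different precisions.
# Assumed parameter count; activations, optimizer state, and CUDA
# context add several more GiB on top of this.
params = 6e9

def weight_gib(bytes_per_param):
    """GiB needed just to hold the weights at a given precision."""
    return params * bytes_per_param / 2**30

fp16 = weight_gib(2)    # ~11.2 GiB: nearly fills a 16 GiB card before anything runs
int8 = weight_gib(1)    # ~5.6 GiB
int4 = weight_gib(0.5)  # ~2.8 GiB: leaves headroom for fine-tuning state

print(f"fp16 {fp16:.1f} GiB, int8 {int8:.1f} GiB, int4 {int4:.1f} GiB")
```

This is why checking that `q_config` actually requests 4-bit (or at least 8-bit) loading matters more than the batch size here.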

I have paid for 100 compute units and am using an A100 GPU as the session type. I also tried reducing the batch size from 4 to 1, but that didn't help.

The above happens when trying to run your beginner's Colab notebook.

Please try reducing the batch size or calling `torch.cuda.empty_cache()`. You can also use `nvidia-smi` to monitor what's going on and adjust the model to fit your GPU. I have run the beginner script before. You may also refer to my repo and articles here: https://github.com/AI4Finance-Foundation/FinGPT-Research. Hope this helps.
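For the `nvidia-smi` suggestion, a watch loop like the one below (standard `nvidia-smi` query flags) shows whether memory fills up at model load or only once training starts, which tells you whether quantization settings or batch size is the thing to change:

```shell
# Refresh used/total GPU memory every 2 seconds while the notebook runs
nvidia-smi --query-gpu=memory.used,memory.total --format=csv -l 2
```

In a Colab cell, prefix the command with `!` and run it once (without `-l`) between steps to snapshot memory usage.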