microsoft / ContextualSP

Open-source code for multiple papers from the Microsoft Research Asia DKI group


semantic_parsing_in_context: CUDA out of memory

eche043 opened this issue

Hi there.
I'm running the training on Colab with a 16 GB GPU.
I'm using concat.none.jsonnet for BERT and getting a "CUDA out of memory" error at 54% of epoch 0.
How much GPU memory is needed to train the BERT-based model, and is there a way to make it fit in 16 GB?
Thanks
Screenshot (15)

Screenshot (16)
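
As a quick sanity check before training, you can print how much memory the GPU actually has and what the peak usage reaches. This is a minimal sketch using standard PyTorch calls, not part of this repo's code:

```python
import torch

# Report the GPU model and its total memory (Colab GPUs vary: T4 ~16 GB, P100 ~16 GB, V100 ~16 GB).
props = torch.cuda.get_device_properties(0)
print(f"GPU: {props.name}, total memory: {props.total_memory / 1024**3:.1f} GB")

# After (or during) a training run, the peak allocation shows how close you are to the limit.
print(f"Peak allocated: {torch.cuda.max_memory_allocated(0) / 1024**3:.1f} GB")
```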

Problem solved. I needed at least 17 GB of GPU RAM to run the training.
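
For readers stuck with a 16 GB GPU: the usual workaround for CUDA out-of-memory during training is to lower the batch size and, if supported, compensate with gradient accumulation so the effective batch size stays the same. The sketch below only illustrates the idea; the config key names ("iterator" vs. "data_loader", "num_gradient_accumulation_steps") and the exact defaults depend on the AllenNLP version this repo pins, so check concat.none.jsonnet for the actual keys before applying overrides:

```python
import json

# Hypothetical overrides for an AllenNLP-style config; key names are assumptions,
# verify them against concat.none.jsonnet.
overrides = {
    "iterator": {"batch_size": 2},                      # smaller per-step batch -> lower peak memory
    "trainer": {"num_gradient_accumulation_steps": 4},  # keeps the effective batch size (AllenNLP >= 1.x)
}

# If training is launched through the standard `allennlp train` command, this JSON
# can be passed via its --overrides flag instead of editing the jsonnet file.
print(json.dumps(overrides))
```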