advimman / lama

🦙 LaMa Image Inpainting: Resolution-robust Large Mask Inpainting with Fourier Convolutions, WACV 2022

Home Page: https://advimman.github.io/lama-project/

How can I reduce GPU memory usage?

sanbuphy opened this issue · comments

Hello, I think this model works very well, but GPU memory usage is too high during refinement. Could you please advise me on how to reduce memory usage during refinement? Thank you!

@sanbuphy You can reduce GPU memory during training by lowering the batch size, for example:
`python bin/train.py -cn lama-fourier data.batch_size=8`
Adjust the batch size to fit your memory budget.

How can I save memory during inference, if the input size is fixed?

> How can I save memory during inference, if the input size is fixed?

I have the same question about GPU memory usage. One option is to run part of the model's computation on the CPU.
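To make the CPU-offload idea concrete, here is a minimal PyTorch sketch. It is not LaMa's actual code: `coarse` and `refiner` are hypothetical stand-ins for a two-stage inpainting pipeline. The idea is to keep only one stage on the GPU, move intermediate activations across devices, and run under `torch.inference_mode()` so no autograd buffers are kept.

```python
import torch
import torch.nn as nn

# Hypothetical two-stage model standing in for a coarse + refinement pipeline.
coarse = nn.Sequential(nn.Conv2d(3, 8, 3, padding=1), nn.ReLU())
refiner = nn.Sequential(nn.Conv2d(8, 3, 3, padding=1))

device = "cuda" if torch.cuda.is_available() else "cpu"
coarse.to(device).eval()
refiner.to("cpu").eval()  # keep the memory-heavy refinement stage on the CPU

x = torch.randn(1, 3, 64, 64, device=device)

with torch.inference_mode():   # no autograd state -> lower memory footprint
    feats = coarse(x)
    out = refiner(feats.cpu())  # move only the activations, not the whole model

print(tuple(out.shape))
```

Offloading a stage like this trades GPU memory for transfer time and slower CPU compute, so it mainly helps when the refinement stage is what overflows GPU memory. Lowering inference resolution or casting the model to half precision on GPU are other common options.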