AUTOMATIC1111 / stable-diffusion-webui-tensorrt


How to clear pytorch base model GPU memory? Tensorrt GPU memory larger than pytorch.

WudiJoey opened this issue · comments

What happened?

When I launch webui, it uses 3 GB of GPU memory at the beginning even if I do nothing. When I use the base + LoRA model, usage increases to 5 GB and drops back to 3 GB after generating. But when I switch SD Unet to the TensorRT model under [Stable Diffusion] in settings, peak memory reaches 6 GB.
I guess the 3 GB is the PyTorch memory of the base model, so if I only need TRT inference, how do I clear it from my GPU?

Steps to reproduce the problem

Run watch -n 1 -d nvidia-smi to watch how GPU memory changes.
Go to [Settings].
Press [Stable Diffusion], change SD Unet to [TRT models], press [Apply settings].
Go to txt2img and test.

What should have happened?

When using TRT mode, all PyTorch GPU memory for the base model should be cleared.
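For reference, the usual way to release a PyTorch model's GPU memory is to drop every Python reference to it, run the garbage collector, and then flush PyTorch's caching allocator. The sketch below is illustrative only: `model_holder` and `free_pytorch_model` are hypothetical names, not part of the webui API, and this assumes the TRT engine keeps no references into the PyTorch module.

```python
import gc


def free_pytorch_model(model_holder: dict, key: str = "model") -> None:
    """Release a cached PyTorch model and reclaim its GPU memory.

    model_holder: any dict-like cache holding the loaded model
    (hypothetical; stands in for wherever webui keeps the checkpoint).
    """
    # 1. Drop the last reference; CUDA tensors are only freed once
    #    nothing in Python points at them anymore.
    model_holder.pop(key, None)

    # 2. Collect reference cycles that may still pin CUDA tensors.
    gc.collect()

    # 3. Ask PyTorch's caching allocator to return unused blocks to
    #    the driver so nvidia-smi reflects the freed memory.
    try:
        import torch
        if torch.cuda.is_available():
            torch.cuda.empty_cache()
    except ImportError:
        pass  # torch not installed; nothing cached to free


# Example with a stand-in object in place of a real model:
cache = {"model": object()}
free_pytorch_model(cache)
```

Note that `torch.cuda.empty_cache()` alone cannot free the 3 GB: memory held by live tensors is untouched, so the references must be dropped first. As long as the webui process keeps the base model loaded for non-TRT features, that allocation will persist.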

It's 4 GB minimum for generating.