justinpinkney / stable-diffusion


Some confusion about the ckpt size

ssxxx1a opened this issue · comments

Thanks for your work!
I have some confusion about the saved ckpt.
I used the example config for fine-tuning, and I noticed that the saved ckpt file is 13.6 GB, while the official pretrained Stable Diffusion models are 7.2 GB. What is the problem?
My first thought was that it was PyTorch AMP, but I got an error after setting it up in pytorch_lightning, so I wanted to ask first~
@justinpinkney

Probably because the checkpoint is also saving the CLIP model weights, which include two models.


It's the EMA parameters. The ckpt saves both the model parameters and the EMA parameters. The official pretrained weights removed the non-EMA model parameters and kept only the EMA ones.
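For anyone who wants to verify this, here is a rough sketch (not from this repo) that counts the weights stored under each prefix. It assumes the CompVis/latent-diffusion key layout, where the raw weights live under `model.` and the EMA copies under `model_ema.` in the checkpoint's `state_dict`; the file name is a placeholder:

```python
# Rough sketch, assuming the CompVis/latent-diffusion key layout.
# Counts parameters stored under the raw "model." prefix vs. the
# EMA "model_ema." prefix in a saved checkpoint.
import torch

sd = torch.load("last.ckpt", map_location="cpu")["state_dict"]  # placeholder path
n_model = sum(v.numel() for k, v in sd.items() if k.startswith("model."))
n_ema = sum(v.numel() for k, v in sd.items() if k.startswith("model_ema."))
print(f"model.*: {n_model:,} params, model_ema.*: {n_ema:,} params")
```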

It is the optimizer state; if you delete it, you get a 7.2 GB ckpt file.

> It is the optimizer state; if you delete it, you get a 7.2 GB ckpt file.

How can I delete it? :)
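For reference, a minimal sketch of one way to do this with plain PyTorch (file names are placeholders): PyTorch Lightning checkpoints keep the optimizer moments under the `optimizer_states` key, so popping that key and re-saving shrinks the file:

```python
# Minimal sketch, assuming a PyTorch Lightning checkpoint; the
# file names are placeholders. The optimizer moments stored under
# "optimizer_states" are only needed to resume training, so a
# checkpoint meant for inference can drop them.
import torch

ckpt = torch.load("last.ckpt", map_location="cpu")
ckpt.pop("optimizer_states", None)  # remove optimizer state if present
torch.save(ckpt, "last-pruned.ckpt")
```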