lucidrains / lion-pytorch

🦁 Lion, a new optimizer discovered by Google Brain using genetic algorithms that is purportedly better than AdamW, in PyTorch

Same amount of VRAM is taken as in AdamW

VCasecnikovs opened this issue · comments

One of the main benefits of Lion is that it needs to store less state per parameter.
Adam has to keep EMAs of both the first moment (momentum) and the second moment (the RMSProp-style term), while Lion only keeps the momentum EMA.
However, when I try to use Lion, it takes exactly the same amount of memory as AdamW.
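To make the state-size difference concrete, here is a minimal scalar sketch of the two update rules (bias correction and weight decay omitted for brevity; the function names and hyperparameter defaults are illustrative, not the repo's API). The point is that Adam carries two EMA buffers per parameter while Lion carries one:

```python
import math

def adam_step(p, g, state, lr=1e-3, b1=0.9, b2=0.999, eps=1e-8):
    # Adam keeps TWO buffers per parameter: first moment m and second moment v.
    state.setdefault("m", 0.0)
    state.setdefault("v", 0.0)
    state["m"] = b1 * state["m"] + (1 - b1) * g
    state["v"] = b2 * state["v"] + (1 - b2) * g * g
    return p - lr * state["m"] / (math.sqrt(state["v"]) + eps)

def lion_step(p, g, state, lr=1e-3, b1=0.9, b2=0.99):
    # Lion keeps ONE buffer per parameter: the momentum EMA m.
    state.setdefault("m", 0.0)
    # Interpolate momentum and gradient, then take only the SIGN of the result.
    update = b1 * state["m"] + (1 - b1) * g
    sign = 1.0 if update > 0 else -1.0 if update < 0 else 0.0
    p = p - lr * sign
    # Update the single momentum buffer with a separate decay rate.
    state["m"] = b2 * state["m"] + (1 - b2) * g
    return p
```

So per parameter, Lion's optimizer state should be roughly half of Adam's, which is where the expected VRAM saving comes from.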

Hi, what is the model size in your setting?
When the model is small, I think the main memory overhead comes from activations, so the memory saved on the second-moment buffer may not be noticeable.
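A back-of-envelope calculation shows why the saving can disappear in the noise (the 100M-parameter figure below is a hypothetical example, and this counts only fp32 optimizer state, ignoring parameters, gradients, and activations):

```python
def state_bytes(n_params, n_buffers, bytes_per_elem=4):
    # Optimizer state only: n_buffers extra fp32 tensors the size of the model.
    return n_params * n_buffers * bytes_per_elem

n = 100_000_000            # hypothetical 100M-parameter model
adam = state_bytes(n, 2)   # Adam(W): m and v -> 8 bytes/param
lion = state_bytes(n, 1)   # Lion: m only    -> 4 bytes/param
saved_gb = (adam - lion) / 1e9  # ~0.4 GB saved at 100M params
```

At 100M parameters that is only ~0.4 GB, so if activations dominate your footprint (large batch, long sequences), the difference between the two optimizers can be hard to see in total VRAM.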

Are you comparing this to AdamW8bit by chance?

In my setting, Lion takes less memory than AdamW (9.9 GB vs 10.1 GB), but Lion is slower in terms of steps/sec. Has anyone noticed the same? I am comparing Lion with the Triton kernel against fused AdamW.

Did you solve the problem? I have the same problem.