Lightning-AI / pytorch-lightning

Pretrain, finetune and deploy AI models on multiple GPUs, TPUs with zero code changes.

Home Page: https://lightning.ai


Support getting optimizers and lr_schedulers from the DeepSpeed config

npuichigo opened this issue

Description & Motivation

Currently this is not supported in Lightning. In huggingface/accelerate, however, users can pass DummyOptim and DummyScheduler placeholders to signal that the actual optimizer and LR scheduler should be taken from the DeepSpeed config; a rough sketch of that pattern is shown below.
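A minimal sketch of the accelerate pattern, assuming an `Accelerator` launched with a DeepSpeed config that defines both an `optimizer` and a `scheduler` section (the model and hyperparameters here are illustrative):

```python
# Sketch of the huggingface/accelerate pattern: the user constructs placeholder
# objects, and accelerate swaps in the optimizer/scheduler that DeepSpeed builds
# from its config at prepare() time.
import torch
from accelerate import Accelerator
from accelerate.utils import DummyOptim, DummyScheduler

accelerator = Accelerator()  # assumed to be launched with a DeepSpeed config
model = torch.nn.Linear(32, 32)

# Placeholders: no real optimizer or scheduler is instantiated in user code.
optimizer = DummyOptim(model.parameters())
lr_scheduler = DummyScheduler(optimizer)

# prepare() replaces the dummies with the objects DeepSpeed creates from the
# "optimizer" / "scheduler" sections of the config (e.g. OneBitAdam, WarmupLR).
model, optimizer, lr_scheduler = accelerator.prepare(model, optimizer, lr_scheduler)
```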

Pitch

Lightning should provide a way for users to obtain optimizers from the DeepSpeed config, such as OneBitAdam or other optimizers that cannot be instantiated manually outside of DeepSpeed.

For reference, Lightning's DeepSpeed strategy currently initializes the engine from an already-instantiated optimizer:

self._deepspeed_engine, optimizer = self._initialize_engine(module, optimizers[0])
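For context, plain DeepSpeed can already build such optimizers and schedulers itself when no optimizer object is passed to `deepspeed.initialize` and the config contains the corresponding sections. A minimal sketch (config values are illustrative) of the behavior the issue asks Lightning's DeepSpeed strategy to expose:

```python
# Minimal sketch: DeepSpeed instantiates optimizers/schedulers such as OneBitAdam
# from the JSON config when the user does not pass them in explicitly.
import torch
import deepspeed

config = {
    "train_micro_batch_size_per_gpu": 8,
    "optimizer": {"type": "OneBitAdam", "params": {"lr": 1e-4, "freeze_step": 400}},
    "scheduler": {"type": "WarmupLR", "params": {"warmup_num_steps": 100}},
}

model = torch.nn.Linear(32, 32)

# With no optimizer argument, DeepSpeed reads the "optimizer"/"scheduler"
# sections of the config and returns the instantiated objects.
engine, optimizer, _, lr_scheduler = deepspeed.initialize(
    model=model,
    model_parameters=model.parameters(),
    config=config,
)
```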

Alternatives

No response

Additional context

No response

cc @Borda