Lightning-AI / litgpt

Pretrain, finetune, deploy 20+ LLMs on your own data. Uses state-of-the-art techniques: flash attention, FSDP, 4-bit, LoRA, and more.

Home Page: https://lightning.ai

Customizable loss function & inference step?

Boltzmachine opened this issue

The high level of customization offered by PyTorch Lightning is why I opted for it over HuggingFace's stupid Trainer to train LMs. However, it seems that this library once again consolidates everything into single command-line invocations, sacrificing a lot of flexibility. I wonder if there is a way to customize the loss function and the inference process.
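
For concreteness, this is roughly the level of control I mean, sketched in plain PyTorch Lightning (a minimal sketch, not litgpt code; the model and batch layout here are assumptions):

```python
import torch
import torch.nn.functional as F
import lightning as L


class CustomLM(L.LightningModule):
    """Sketch only: `model` is assumed to be any causal LM that maps
    input_ids of shape (batch, seq_len) to logits of shape
    (batch, seq_len, vocab_size)."""

    def __init__(self, model, label_smoothing: float = 0.1):
        super().__init__()
        self.model = model
        self.label_smoothing = label_smoothing

    def training_step(self, batch, batch_idx):
        logits = self.model(batch["input_ids"])
        # Custom loss: label-smoothed cross-entropy instead of the default.
        loss = F.cross_entropy(
            logits.view(-1, logits.size(-1)),
            batch["labels"].view(-1),
            ignore_index=-100,
            label_smoothing=self.label_smoothing,
        )
        self.log("train_loss", loss)
        return loss

    @torch.no_grad()
    def predict_step(self, batch, batch_idx):
        # Custom inference: greedy choice of the next token only, for brevity.
        logits = self.model(batch["input_ids"])
        return logits[:, -1].argmax(dim=-1)

    def configure_optimizers(self):
        return torch.optim.AdamW(self.parameters(), lr=3e-4)
```

The key point is being able to override `training_step` and `predict_step` like this while still reusing litgpt's model definitions and checkpoints, rather than being locked into a fixed CLI pipeline.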

Thanks for the feedback! The repo originally started out as self-contained scripts, and we gradually transitioned it to the command-line interface you see now because that is the easiest entry point for most people (including non-coders).

But as you said, it would be nice to also offer other ways to use the code, and eventually we may add a Python interface. Thanks for the suggestion!
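
To make the idea concrete, a Python interface could accept user-supplied hooks such as a loss callable; the sketch below is purely hypothetical, and the `LLM.load` / `finetune` / `generate` names in the comments are placeholders, not an existing litgpt API:

```python
# Hypothetical sketch only: the entry-point names in the comments below are
# placeholders for a possible future Python interface, not litgpt functions.
import torch
import torch.nn.functional as F


def label_smoothed_ce(logits: torch.Tensor, targets: torch.Tensor) -> torch.Tensor:
    # A user-defined loss that such an interface could accept as a hook.
    return F.cross_entropy(
        logits.view(-1, logits.size(-1)),
        targets.view(-1),
        ignore_index=-100,
        label_smoothing=0.1,
    )


# Possible usage with placeholder names:
# llm = LLM.load("checkpoints/...")
# llm.finetune(data=my_dataset, loss_fn=label_smoothed_ce)
# text = llm.generate("Hello", max_new_tokens=50)
```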

Thanks for your feedback! Lightning is the best deep-learning training framework. I really hope it can become even better than HuggingFace for training LMs.