transformerlab / transformerlab-app

Open Source Application for Advanced LLM Engineering: interact, train, fine-tune, and evaluate large language models on your own computer.

Home Page: https://transformerlab.ai/


Make it optional to fuse the adapter/lora

ai-made-approachable opened this issue

When creating a training template, add a configuration setting that lets the user decide whether to fuse the adapter/LoRA into the base model after training, instead of always fusing it.
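One way this could look is a boolean field in the training template config that simply skips the merge step. The sketch below is a minimal illustration using Hugging Face PEFT's `merge_and_unload()`; the `fuse_adapter` flag, function name, and paths are hypothetical and not part of Transformer Lab's actual training plugins, which may use a different backend (e.g. MLX) for fusing.

```python
# Hypothetical sketch: gate adapter fusing behind a config flag instead of
# always merging after training. Flag name and paths are illustrative only.
from transformers import AutoModelForCausalLM
from peft import PeftModel


def finish_training(base_model_id: str, adapter_dir: str, output_dir: str,
                    fuse_adapter: bool = False) -> None:
    if not fuse_adapter:
        # Keep the LoRA adapter as a standalone artifact; it can be loaded
        # on top of the base model at inference time instead of being merged.
        print(f"Adapter left unfused at {adapter_dir}")
        return

    # Fuse: fold the LoRA deltas into the base weights and save a full checkpoint.
    base = AutoModelForCausalLM.from_pretrained(base_model_id)
    model = PeftModel.from_pretrained(base, adapter_dir)
    merged = model.merge_and_unload()
    merged.save_pretrained(output_dir)
```

Keeping the adapter unfused saves disk space (no duplicate full-size checkpoint) and allows swapping adapters over a single base model, while fusing produces a self-contained model that needs no PEFT machinery at inference time; exposing the choice covers both workflows.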