Lightning-AI / lit-llama

Implementation of the LLaMA language model based on nanoGPT. Supports flash attention, Int8 and GPTQ 4-bit quantization, LoRA and LLaMA-Adapter fine-tuning, and pre-training. Apache 2.0-licensed.

Converting from lit-llama to HF checkpoint?

jacqueline-he opened this issue

Hello,
Thanks for the great work! I have a pre-trained Lit-LLaMA checkpoint that I'd like to convert to a format supported by HF, so that I can use it as an off-the-shelf model in other evaluation suites (e.g., lm-eval-harness). Is this currently possible? From snooping around, it looks like there's been some work in #150, but I'm not sure what came of it. Thank you in advance!