Lightning-AI / lit-llama

Implementation of the LLaMA language model based on nanoGPT. Supports flash attention, Int8 and GPTQ 4-bit quantization, LoRA and LLaMA-Adapter fine-tuning, and pre-training. Apache 2.0-licensed.

Can I use Lightning Fabric to pre-train LLaMA 2 on a V100?

JerryDaHeLian opened this issue · comments

So can Lightning Fabric actually be used to pre-train LLaMA 2 on a V100, or not?
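One practical wrinkle behind this question: the V100 is a Volta GPU (compute capability 7.0) with no native bfloat16 support, so a Fabric config written for Ampere-class cards with `precision="bf16-mixed"` would need to fall back to fp16 mixed precision (`"16-mixed"`). A minimal sketch of that selection logic, assuming the Fabric precision strings `"bf16-mixed"` and `"16-mixed"` (the helper function name is hypothetical, not part of Lightning):

```python
def pick_fabric_precision(compute_capability: tuple[int, int]) -> str:
    """Choose a Lightning Fabric precision string for a given GPU.

    bfloat16 requires Ampere (compute capability 8.0) or newer;
    the V100 is Volta (7.0), so it must use fp16 mixed precision.
    """
    major, _minor = compute_capability
    if major >= 8:
        return "bf16-mixed"
    return "16-mixed"


# V100 (sm_70) falls back to fp16 mixed precision:
print(pick_fabric_precision((7, 0)))  # -> 16-mixed
# A100 (sm_80) can use bf16:
print(pick_fabric_precision((8, 0)))  # -> bf16-mixed
```

In a real run the compute capability would come from `torch.cuda.get_device_capability()`, and the resulting string would be passed as `Fabric(precision=...)`; whether pre-training then fits in the V100's 16/32 GB of memory is a separate question of model size, sharding strategy, and batch size.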