33B Chinese LLM, DPO QLoRA fine-tuning, 100K-token context window, and AirLLM 70B inference on a single 4GB GPU