
Xwin-LM: Powerful, Stable, and Reproducible LLM Alignment


Can this model be used for inference/finetuning with FastChat?

wanzhu666666 opened this issue · comments

Hi @wanzhu666666!
Since the Xwin-LM-V0.1 series of models are finetuned from Llama2 pretrained models, we believe they can be directly used for inference/finetuning with the FastChat repo.


One kind reminder: you need to use exactly the same conversation template mentioned in the README.md, which is the same as Vicuna's.
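For reference, a minimal sketch of that Vicuna-style template, assuming the system prompt documented in the Xwin-LM README; `build_prompt` is a hypothetical helper, not part of FastChat:

```python
# Vicuna-style conversation template used by Xwin-LM-V0.1 (per the README).
# NOTE: build_prompt is an illustrative helper, not a FastChat API.

SYSTEM = (
    "A chat between a curious user and an artificial intelligence assistant. "
    "The assistant gives helpful, detailed, and polite answers to the user's questions."
)

def build_prompt(turns):
    """Format (user, assistant) turns; pass None as the assistant reply
    for the final turn so the model generates the answer."""
    prompt = SYSTEM
    for user, assistant in turns:
        prompt += f" USER: {user} ASSISTANT:"
        if assistant is not None:
            # Close completed assistant turns with the EOS token.
            prompt += f" {assistant}</s>"
    return prompt

print(build_prompt([("Hello, can you help me?", None)]))
```

When finetuning with FastChat, selecting the `vicuna_v1.1` conversation template should yield the same formatting.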