arielnlee / Platypus

Code for fine-tuning the Platypus family of LLMs using LoRA


Using model through HF

shubhamagarwal92 opened this issue · comments

Hi!

Thanks for open-sourcing the code! Is the model available through the HF Hub as follows:

import torch
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("garage-bAInd/Platypus2-70B")
model = AutoModelForCausalLM.from_pretrained("garage-bAInd/Platypus2-70B", trust_remote_code=True, torch_dtype=torch.float16)