Using model through HF
shubhamagarwal92 opened this issue
Hi!
Thanks for open-sourcing the code! Is the model available through Hugging Face as follows?

```python
import torch
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("garage-bAInd/Platypus2-70B")
model = AutoModelForCausalLM.from_pretrained(
    "garage-bAInd/Platypus2-70B",
    trust_remote_code=True,
    torch_dtype=torch.float16,
)
```