model format
hhy-joseph opened this issue · comments
I'm trying to run with meta-llama/Llama-2-13b-chat, but got the error below:
OSError: ./llama/llama-2-13b-chat does not appear to have a file named config.json.
Inside Llama-2-13b-chat/: checklist.chk, consolidated.00.pth, consolidated.01.pth, params.json
I think you need the Hugging Face checkpoint, meta-llama/Llama-2-13b-chat-hf, not the original Meta release — the consolidated.*.pth / params.json files are the Meta format, which has no config.json.
I think this is an HF-format issue, but let me know if I'm wrong.