InternLM / InternLM

Official release of the InternLM2 7B and 20B base and chat models, with 200K context support.

Home Page: https://internlm.intern-ai.org.cn/

InternLM2: errors when loading the tokenizer and model from a local path after downloading the model

syp1997 opened this issue

Describe the bug

from transformers import AutoTokenizer, AutoModelForCausalLM

model_path = "/data/suyinpei1/workspace/llms/internlm2-chat-20b"
tokenizer = AutoTokenizer.from_pretrained(model_path, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(model_path, device_map="auto", trust_remote_code=True)

After downloading the model from Hugging Face, loading the tokenizer and the model from a local path raises errors:

Tokenizer loading error:
ValueError: Unrecognized configuration class <class 'transformers_modules.internlm2-chat-20b.configuration_internlm2.InternLM2Config'> to build an AutoTokenizer.

Model loading error:
OSError: Error no file named pytorch_model.bin, tf_model.h5, model.ckpt.index or flax_model.msgpack found in directory /data/suyinpei1/workspace/llms/internlm2-chat-20b.
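Both errors point to an incomplete local snapshot: the ValueError usually means the tokenizer files are missing, so AutoTokenizer falls back to the model config and cannot map InternLM2Config to a tokenizer class, and the OSError means no weight files were found in the directory. A quick sanity check is to list the directory contents; the snippet below is a minimal sketch reusing the path from the report, and the expected file names reflect typical InternLM2 checkpoints rather than anything confirmed in this thread.

import os

model_path = "/data/suyinpei1/workspace/llms/internlm2-chat-20b"
# A complete snapshot typically contains config.json, the remote-code modules
# (configuration_internlm2.py, modeling_internlm2.py, tokenization_internlm2.py),
# tokenizer.model and tokenizer_config.json, and the sharded weights
# (pytorch_model-*.bin or model-*.safetensors plus the matching index file).
print(sorted(os.listdir(model_path)))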

Environment

torch 2.0.0
transformers 4.37.2

Other information

No response

Did you perhaps miss downloading some files? Which files are in the local directory? You can check with ls.
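If files are indeed missing, one way to complete the download is huggingface_hub's snapshot_download, which fetches the full repo snapshot; the sketch below assumes the Hub repo id internlm/internlm2-chat-20b and reuses the local path from the report.

from huggingface_hub import snapshot_download

# Fetch the complete repo snapshot into the local directory.
snapshot_download(
    repo_id="internlm/internlm2-chat-20b",
    local_dir="/data/suyinpei1/workspace/llms/internlm2-chat-20b",
)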

Thanks, some files were indeed missing from the download. It's resolved now 🙏