InternLM / InternLM

Official release of InternLM2 7B and 20B base and chat models, with 200K context support.

Home Page: https://internlm.intern-ai.org.cn/


[Bug] `Unrecognized configuration class Error` returned by `AutoTokenizer.from_pretrained` with InternLM-chat-1.8b-sft (transformers==4.36)

openmmlab-bot opened this issue

Describe the bug

```
A new version of the following files was downloaded from https://huggingface.co/internlm/internlm2-chat-1_8b-sft:
- configuration_internlm2.py
Make sure to double-check they do not contain any added malicious code. To avoid downloading new versions of the code file, you can pin a revision.
Traceback (most recent call last):
  File "/home/lizhenxiang/tmp_1b.py", line 8, in <module>
    tokenizer = AutoTokenizer.from_pretrained("internlm/internlm2-chat-1_8b-sft", token=access_token, trust_remote_code=True)
  File "/home/lizhenxiang/df/envs/lmdeploy0.0.8/lib/python3.10/site-packages/transformers/models/auto/tokenization_auto.py", line 815, in from_pretrained
    raise ValueError(
ValueError: Unrecognized configuration class <class 'transformers_modules.internlm.internlm2-chat-1_8b-sft.88b7e6ea474c4d8b9f6eb27cca6d79767593a46b.configuration_internlm2.InternLM2Config'> to build an AutoTokenizer.
```
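The download warning above suggests pinning a revision so the remote code files cannot change underneath you. A minimal sketch of how the `revision` argument to `from_pretrained` would be assembled, using the commit hash that appears in the traceback (the helper `pinned_load_kwargs` is hypothetical, not a transformers API):

```python
# Hypothetical helper: collects the arguments for a pinned
# AutoTokenizer.from_pretrained(...) call.
def pinned_load_kwargs(repo: str, commit: str) -> dict:
    return {
        "pretrained_model_name_or_path": repo,
        "revision": commit,         # pin the remote code to this exact commit
        "trust_remote_code": True,  # required for InternLM2's custom classes
    }

kwargs = pinned_load_kwargs(
    "internlm/internlm2-chat-1_8b-sft",
    "88b7e6ea474c4d8b9f6eb27cca6d79767593a46b",  # commit from the traceback
)
# On a working transformers version, the load itself would then be:
# tokenizer = AutoTokenizer.from_pretrained(**kwargs)
```

Pinning does not fix the `Unrecognized configuration class` error itself, but it silences the "new version downloaded" warning and makes the failure reproducible.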

Environment information

transformers: 4.36

Other information

transformers 4.34 and 4.37 work normally.
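Since 4.34 and 4.37 work while 4.36 fails, a small guard can flag the known-bad release before attempting the load. This is a sketch based only on the versions reported in this issue; `tokenizer_load_supported` is a hypothetical helper, not part of transformers:

```python
def tokenizer_load_supported(transformers_version: str) -> bool:
    """Return False for the transformers release reported in this issue
    to fail when loading internlm2-chat-1_8b-sft (4.36.x)."""
    major, minor = transformers_version.split(".")[:2]
    return (int(major), int(minor)) != (4, 36)

# Versions reported in this issue:
print(tokenizer_load_supported("4.34"))    # reported working
print(tokenizer_load_supported("4.36.2"))  # the failing release
print(tokenizer_load_supported("4.37"))    # reported working
```

In practice the simplest fix is to upgrade to transformers 4.37 or newer; the guard is only useful if an environment is stuck on a pinned 4.36.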

This issue is marked as stale because it has been marked as invalid or awaiting response for 7 days without any further response. It will be closed in 7 days if the stale label is not removed or if there is no further response.

This issue is closed because it has been stale for 7 days. Please open a new issue if you have similar issues or you have any new updates now.