OpenNMT / CTranslate2

Fast inference engine for Transformer models

Home Page: https://opennmt.net/CTranslate2

Yi-6B-Chat conversion failed.

ninehills opened this issue

$ ct2-transformers-converter --model ~/models/Yi-6B-Chat --output_dir ~/models/Yi-6B-Chat-CT2
Loading checkpoint shards: 100%|████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 3/3 [00:01<00:00,  1.56it/s]
Traceback (most recent call last):
  File "/home/cynic/miniconda3/envs/openllm/bin/ct2-transformers-converter", line 8, in <module>
    sys.exit(main())
  File "/home/cynic/miniconda3/envs/openllm/lib/python3.10/site-packages/ctranslate2/converters/transformers.py", line 2008, in main
    converter.convert_from_args(args)
  File "/home/cynic/miniconda3/envs/openllm/lib/python3.10/site-packages/ctranslate2/converters/converter.py", line 50, in convert_from_args
    return self.convert(
  File "/home/cynic/miniconda3/envs/openllm/lib/python3.10/site-packages/ctranslate2/converters/converter.py", line 97, in convert
    model_spec.validate()
  File "/home/cynic/miniconda3/envs/openllm/lib/python3.10/site-packages/ctranslate2/specs/model_spec.py", line 590, in validate
    raise ValueError(
ValueError: Vocabulary has size 64002 but the model expected a vocabulary of size 64000
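The traceback points to a mismatch between the tokenizer's vocabulary (64002 entries) and the model's embedding table (64000 rows). Below is a minimal diagnostic sketch using the transformers library to confirm the mismatch locally; the local path and the resize_token_embeddings workaround are illustrative assumptions, not a fix confirmed in this issue.

# Minimal sketch (assumptions: local model path, resize workaround).
# Compares the tokenizer's vocabulary size with the model's configured
# vocab_size to reproduce the 64002 vs 64000 mismatch from the converter.
import os

from transformers import AutoConfig, AutoModelForCausalLM, AutoTokenizer

model_dir = os.path.expanduser("~/models/Yi-6B-Chat")  # path taken from the report above

config = AutoConfig.from_pretrained(model_dir)
tokenizer = AutoTokenizer.from_pretrained(model_dir)

print("config.vocab_size:", config.vocab_size)  # expected: 64000
print("len(tokenizer)   :", len(tokenizer))     # expected: 64002 (added special tokens)

# One possible workaround (an assumption, not an upstream-confirmed fix):
# grow the embedding matrix to match the tokenizer, save a resized copy,
# and point ct2-transformers-converter at that copy instead.
if len(tokenizer) != config.vocab_size:
    model = AutoModelForCausalLM.from_pretrained(model_dir)
    model.resize_token_embeddings(len(tokenizer))
    model.save_pretrained(model_dir + "-resized")
    tokenizer.save_pretrained(model_dir + "-resized")

If the mismatch is confirmed, the resized copy can then be converted with the same command as above, e.g. ct2-transformers-converter --model ~/models/Yi-6B-Chat-resized --output_dir ~/models/Yi-6B-Chat-CT2.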