[Bug][model] VLLM run Qwen-14B-Chat with error: AttributeError: 'TokenizerGroup' object has no attribute 'eos_token_id'
TuDaCheng opened this issue
Search before asking
- I had searched in the issues and found no similar issues.
Operating system information
Linux
Python version information
3.10
DB-GPT version
latest release
Related scenes
- Chat Data
- Chat Excel
- Chat DB
- Chat Knowledge
- Model Management
- Dashboard
- Plugins
Installation Information
- AutoDL Image
- Other
Device information
A100 GPU
Models information
Qwen-14B-Chat
What happened
Running inference with vLLM fails with: AttributeError: 'TokenizerGroup' object has no attribute 'eos_token_id'
What you expected to happen
I am hoping to find a fix or workaround so inference runs without this error.
How to reproduce
1
Additional context
1
Are you willing to submit PR?
- Yes I am willing to submit a PR!
Full error log:
2024-06-04 18:26:05 pjxawlenwmqxsmes-snow-d7ffd786f-554mp dbgpt.model.cluster.worker.default_worker[3034] ERROR Model inference error, detail: Traceback (most recent call last):
File "/data/TCC/DB-GPT/dbgpt/model/cluster/worker/default_worker.py", line 246, in async_generate_stream
async for output in generate_stream_func(
File "/data/TCC/DB-GPT/dbgpt/model/llm_out/vllm_llm.py", line 26, in generate_stream
if tokenizer.eos_token_id is not None:
AttributeError: 'TokenizerGroup' object has no attribute 'eos_token_id'
This is a bug introduced with vllm >= 0.2.7; I will fix it.