eosphoros-ai / DB-GPT

AI Native Data App Development framework with AWEL (Agentic Workflow Expression Language) and Agents

Home Page: https://docs.dbgpt.site


[Bug] [model] Load qwen1.5-7b-chat success but can't use it

adogwangwang opened this issue · comments

Search before asking

  • I had searched in the issues and found no similar issues.

Operating system information

Linux

Python version information

3.11

DB-GPT version

main

Related scenes

  • Chat Data
  • Chat Excel
  • Chat DB
  • Chat Knowledge
  • Model Management
  • Dashboard
  • Plugins

Installation Information

Device information

v100

Models information

qwen1.5-7b-chat

What happened

When I use a local qwen1.5 model, it loads successfully and registers with the controller. `curl 127.0.0.1:5670/api/v1/worker/model/list` returns the expected information:

```json
{"success":true,"err_code":null,"err_msg":null,"data":[{"model_name":"qwen1.5-7b-chat","model_type":"llm","host":"10.244.46.157","port":5670,"manager_host":"10.244.46.157","manager_port":5670,"healthy":true,"check_healthy":true,"prompt_template":null,"last_heartbeat":"2024-06-03T08:11:54.923479","stream_api":null,"nostream_api":null}]}
```

However, when I use the LLM chat feature on the home page, it fails with:

```
Traceback (most recent call last):
  File "/app/dbgpt/app/scene/base_chat.py", line 286, in stream_call
    async for output in self.call_streaming_operator(payload):
  File "/app/dbgpt/app/scene/base_chat.py", line 172, in call_streaming_operator
    async for out in await llm_task.call_stream(call_data=request):
  File "/app/dbgpt/storage/cache/operators.py", line 193, in transform_stream
    async for out in input_value:
  File "/app/dbgpt/core/interface/operators/llm_operator.py", line 323, in streamify
    async for output in self.llm_client.generate_stream(request):  # type: ignore
  File "/app/dbgpt/model/cluster/client.py", line 88, in generate_stream
    request = await self.covert_message(request, message_converter)
  File "/app/dbgpt/core/interface/llm.py", line 816, in covert_message
    model_metadata = await self.get_model_metadata(request.model)
  File "/app/dbgpt/core/interface/llm.py", line 858, in get_model_metadata
    raise ValueError(f"Model {model} not found")
ValueError: Model qwen1.5-7b-chat not found
2024-06-03 08:20:23 dbgpt-6cf59c4694-q4mzp dbgpt.app.scene.base_chat[1] ERROR model response parse failed! Model qwen1.5-7b-chat not found
```

The metadata for this model cannot be found. Why does this happen, and how can I fix it?
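For what it's worth, the worker-list response quoted above can be sanity-checked programmatically before digging into the metadata lookup; a minimal sketch using only the standard library, with the JSON copied verbatim from the report:

```python
import json

# Worker-list response quoted in the report above
# (from: curl 127.0.0.1:5670/api/v1/worker/model/list)
response_text = """{"success":true,"err_code":null,"err_msg":null,"data":[{"model_name":"qwen1.5-7b-chat","model_type":"llm","host":"10.244.46.157","port":5670,"manager_host":"10.244.46.157","manager_port":5670,"healthy":true,"check_healthy":true,"prompt_template":null,"last_heartbeat":"2024-06-03T08:11:54.923479","stream_api":null,"nostream_api":null}]}"""

resp = json.loads(response_text)

# Chat fails with "Model qwen1.5-7b-chat not found", so first confirm the
# worker list really contains a healthy entry under exactly that name
# (a name mismatch or unhealthy worker would point elsewhere).
target = "qwen1.5-7b-chat"
healthy_workers = [
    w for w in resp["data"]
    if w["model_name"] == target and w["healthy"]
]
print(f"{target} healthy workers: {len(healthy_workers)}")  # → 1
```

Here the worker entry is present and healthy, which suggests the failure is in the later metadata lookup (`get_model_metadata` in `dbgpt/core/interface/llm.py`) rather than in worker registration.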

What you expected to happen

A working way to use the local qwen1.5-7b-chat model.

How to reproduce

See the error above.

Additional context

No response

Are you willing to submit PR?

  • Yes I am willing to submit a PR!

Please run the following command for more details:

`dbgpt model list`
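Alongside `dbgpt model list`, the worker API quoted in the report can be queried directly. A minimal sketch using only the standard library; the host, port, and endpoint path are taken from the report's `curl` command and may differ in your deployment:

```python
import json
import urllib.request

def worker_model_url(host: str = "127.0.0.1", port: int = 5670) -> str:
    """Build the worker model-list URL used in the report's curl command."""
    return f"http://{host}:{port}/api/v1/worker/model/list"

def list_worker_models(host: str = "127.0.0.1", port: int = 5670):
    """Fetch registered worker models; returns the 'data' list on success."""
    with urllib.request.urlopen(worker_model_url(host, port)) as r:
        payload = json.loads(r.read().decode("utf-8"))
    if not payload.get("success"):
        raise RuntimeError(f"worker list failed: {payload.get('err_msg')}")
    return payload["data"]
```

If the model shows up here as healthy but chat still raises `Model qwen1.5-7b-chat not found`, the discrepancy is between worker registration and the metadata lookup in `dbgpt/core/interface/llm.py`, which is what the traceback points at.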

This issue has been marked as stale because there has been no activity for over 30 days.

This issue has been closed because it was marked as stale and there has been no activity for over 7 days.