[Feature][proxy adapter] Special tokens appear in responses when chatting through the OpenAI API served by vLLM
ArboterJams opened this issue · comments
ArboterJams commented
Search before asking
- I had searched in the issues and found no similar feature requirement.
Description
Text like <|im_end|><|im_start|>user appears in the model's responses.
Use case
I would like to ask how to modify the code so that the model's responses stay normal.
Related issues
No response
Feature Priority
None
Are you willing to submit PR?
- Yes I am willing to submit a PR!
Kain Shu commented
Try manually adding one line of code yourself.
file: `dbgpt/model/proxy/llms/chatgpt.py`
```python
async def chatgpt_generate_stream(
    model: ProxyModel, tokenizer, params, device, context_len=2048
):
    client: OpenAILLMClient = model.proxy_llm_client
    context = ModelRequestContext(stream=True, user_name=params.get("user_name"))
    request = ModelRequest.build_request(
        client.default_model,
        messages=params["messages"],
        temperature=params.get("temperature"),
        context=context,
        max_new_tokens=params.get("max_new_tokens"),
        stop="<|im_end|>",  # added: stop generation at the chat-template end token
    )
    async for r in client.generate_stream(request):
        yield r
```
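If you cannot pass a stop parameter through to the server, a client-side fallback is to strip the leaked control tokens from the generated text. Below is a minimal sketch; the helper name and regex are illustrative and assume ChatML-style tokens (`<|im_start|>` / `<|im_end|>`), as used by Qwen-style chat templates. Adjust the token names to match your model's template.

```python
import re

# ChatML control tokens; extend this pattern for other templates.
SPECIAL_TOKEN_RE = re.compile(r"<\|im_(?:start|end)\|>")


def strip_special_tokens(text: str) -> str:
    """Drop everything after the first <|im_end|> and remove control tokens.

    Anything after <|im_end|> is a hallucinated follow-up turn (e.g. a fake
    "user" message), so it is discarded rather than cleaned.
    """
    head = text.split("<|im_end|>", 1)[0]
    return SPECIAL_TOKEN_RE.sub("", head)
```

Note that truncating at the first `<|im_end|>` is only safe on the accumulated response text, not on individual stream chunks, since a token like `<|im_end|>` may be split across chunk boundaries.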