microsoft / TaskWeaver

A code-first agent framework for seamlessly planning and executing data analytics tasks.

Home Page: https://microsoft.github.io/TaskWeaver/

Not able to run TaskWeaver with LLM Qwen1.5-72B-Chat

Haxeebraja opened this issue · comments

Not able to run TaskWeaver with a locally hosted Qwen1.5-72B-Chat.
TaskWeaver worked fine with Qwen-72B-Chat.

Getting the error:
Exception: OpenAI API request was invalid: Error code: 400 - {'object': 'error', 'message': 'top_p must be in (0, 1], got 0.0.', 'type': 'BadRequestError', 'param': None, 'code': 400}
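The 400 response comes from vLLM's sampling-parameter validation, which rejects any top_p outside the half-open interval (0, 1]; TaskWeaver was sending top_p=0.0. A minimal sketch of that check (the function name is illustrative, not vLLM's actual code):

```python
def validate_top_p(top_p: float) -> float:
    """Reject top_p values outside (0, 1], mirroring vLLM's sampling check."""
    if not (0.0 < top_p <= 1.0):
        raise ValueError(f"top_p must be in (0, 1], got {top_p}.")
    return top_p

# top_p == 0.0 (what TaskWeaver sent by default) is rejected;
# 1.0 is the inclusive upper bound and passes.
```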

Qwen is hosted using vLLM:
python -m vllm.entrypoints.openai.api_server --served-model-name Qwen1.5-72B-Chat --model Qwen/Qwen1.5-72B-Chat

Taskweaver config file:
{
    "llm.api_base": "http://172.17.0.8:8283/v1",
    "llm.api_key": "Null",
    "llm.model": "Qwen1.5-72B-Chat",
    "execution_service.kernel_mode": "local"
}

The following call works fine with ChatOpenAI in another application:
model = ChatOpenAI(
    model_name="Qwen1.5-72B-Chat",
    base_url="http://172.17.0.8:8283/v1/",
    api_key="EMPTY",
    temperature=0,
)

The error message says that you need to configure a valid top_p. In TaskWeaver, you can configure it by setting llm.openai.top_p in the config file.

Setting llm.openai.top_p, along with setting the response format to text, made it work.
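For reference, a working config under these assumptions might look like the following. The top_p value 0.1 is an example choice (any value in (0, 1] satisfies vLLM's check), and llm.response_format is the key assumed here for switching the response format to plain text:

```json
{
    "llm.api_base": "http://172.17.0.8:8283/v1",
    "llm.api_key": "Null",
    "llm.model": "Qwen1.5-72B-Chat",
    "llm.response_format": "text",
    "llm.openai.top_p": 0.1,
    "execution_service.kernel_mode": "local"
}
```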