dataelement / bisheng

Bisheng is an open LLM devops platform for next generation AI applications.

Home Page: https://bisheng.dataelem.com/

The condensed question is streamed to the chat dialog

yueluu opened this issue · comments

I don't want to expose the condense_question_llm parameter. Is there a recommended practice for setting a default condense_question_llm, e.g., reusing the llm object with streaming set to False?

(screenshot of the code change)

I updated the code, and it seems to work.
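The pattern discussed here can be sketched as follows: derive the condense-question LLM from the main chat LLM by copying its configuration with streaming disabled, so the rephrased question is never streamed into the chat dialog. This is a minimal stdlib-only sketch; `LLMConfig` and `build_condense_question_llm` are hypothetical stand-ins, not Bisheng APIs. With LangChain's pydantic-based LLM objects, the analogous step would be something like `llm.copy(update={"streaming": False})` passed as `condense_question_llm`.

```python
from dataclasses import dataclass, replace

# Hypothetical stand-in for an LLM configuration object. In a real
# LangChain setup the LLM is a pydantic model, so copying it with an
# override plays the same role as dataclasses.replace() here.
@dataclass(frozen=True)
class LLMConfig:
    model: str
    streaming: bool = True

def build_condense_question_llm(llm: LLMConfig) -> LLMConfig:
    """Derive a default condense_question_llm from the main llm:
    identical settings, but with streaming disabled so the condensed
    question stays internal instead of appearing in the chat dialog."""
    return replace(llm, streaming=False)

chat_llm = LLMConfig(model="gpt-3.5-turbo", streaming=True)
condense_llm = build_condense_question_llm(chat_llm)

print(chat_llm.streaming)      # the user-facing answer still streams
print(condense_llm.streaming)  # the question-condensing step does not
```

Keeping the answer-generating LLM streaming while the condensing LLM is not means users still see tokens arrive for the final answer, without the intermediate rephrased question leaking into the UI.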