[Feature Request]: Please support stream_chat for vllm
DavidLetGo opened this issue
DavidLetGo commented
Feature Description
The `stream_chat` function needs to be implemented; it is currently a stub that raises an error:
```python
class Vllm(LLM):
    @llm_chat_callback()
    def stream_chat(
        self, messages: Sequence[ChatMessage], **kwargs: Any
    ) -> ChatResponseGen:
        raise ValueError("Not Implemented")
```
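One plausible shape for this feature is the usual streaming pattern: convert the chat messages to a prompt, stream text deltas from the backend, and yield `ChatResponse` objects whose content accumulates as deltas arrive. Below is a minimal self-contained sketch of that pattern; the `ChatMessage`/`ChatResponse` classes, the `messages_to_prompt` formatting, and the `_stream_tokens` helper are simplified stand-ins, not the actual llama_index or vLLM APIs.

```python
from dataclasses import dataclass
from typing import Any, Generator, Iterable, Sequence

# Simplified stand-ins for llama_index's ChatMessage / ChatResponse types.
@dataclass
class ChatMessage:
    role: str
    content: str

@dataclass
class ChatResponse:
    message: ChatMessage
    delta: str = ""

ChatResponseGen = Generator[ChatResponse, None, None]

class Vllm:
    def _stream_tokens(self, prompt: str, **kwargs: Any) -> Iterable[str]:
        # Placeholder for the real vLLM streaming call; here it just
        # yields the prompt back one whitespace-delimited chunk at a time.
        for token in prompt.split():
            yield token + " "

    def messages_to_prompt(self, messages: Sequence[ChatMessage]) -> str:
        # Naive formatting; a real implementation would use the model's
        # chat template.
        return "\n".join(f"{m.role}: {m.content}" for m in messages)

    def stream_chat(
        self, messages: Sequence[ChatMessage], **kwargs: Any
    ) -> ChatResponseGen:
        prompt = self.messages_to_prompt(messages)

        def gen() -> ChatResponseGen:
            content = ""
            for delta in self._stream_tokens(prompt, **kwargs):
                content += delta
                # Each yielded response carries the latest delta plus the
                # full content accumulated so far.
                yield ChatResponse(
                    message=ChatMessage(role="assistant", content=content),
                    delta=delta,
                )

        return gen()
```

A caller would then iterate the generator, printing `delta` as chunks arrive and keeping the final `message.content` as the complete reply.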
Reason
No response
Value of Feature
vLLM is popular and widely used. Would it be possible to integrate it into this project?
DavidLetGo commented
Can anyone help with this?