jina-ai / langchain-serve

⚡ Langchain apps in production using Jina & FastAPI

Home Page: https://cloud.jina.ai


Add support for streaming with chains

aiswaryasankar opened this issue

It would be great to support streaming for LangChain chains, for example:

```python
from langchain.callbacks.manager import CallbackManager
from langchain.chains import ConversationalRetrievalChain
from langchain.chat_models import ChatOpenAI

streaming_handler = kwargs.get('streaming_handler')

model = ChatOpenAI(
    model='gpt-3.5-turbo',
    temperature=0.0,
    verbose=True,
    streaming=True,  # Pass `streaming=True` so the client receives tokens as they are generated.
    callback_manager=CallbackManager([streaming_handler]),
)
qa = ConversationalRetrievalChain.from_llm(model, retriever=retriever)
```

This is now supported by passing the callback manager to the chains as well.

```python
from langchain.chains import LLMChain

chain = LLMChain(
    llm=llm,
    prompt=prompt,
    verbose=True,
    callback_manager=llm.callback_manager,  # Reuse the LLM's callback manager so chain events stream too.
)
```
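For context, the `streaming_handler` above is a LangChain callback handler: when the model runs with `streaming=True`, it invokes the handler's `on_llm_new_token` hook once per generated token. Below is a minimal, dependency-free sketch of that interface; the `TokenCollector` class and the simulated token loop are illustrative stand-ins, not part of langchain-serve or LangChain itself.

```python
class TokenCollector:
    """Minimal stand-in for a streaming callback handler: it implements
    only the on_llm_new_token hook that a streaming LLM calls."""

    def __init__(self):
        self.tokens = []

    def on_llm_new_token(self, token: str, **kwargs) -> None:
        # Called once per token while the model is generating;
        # a real handler would forward the token to the client here.
        self.tokens.append(token)


handler = TokenCollector()
# Simulate what a streaming model does: emit tokens one at a time.
for token in ["Hello", ",", " world"]:
    handler.on_llm_new_token(token)
print("".join(handler.tokens))
```

A real handler would push each token onto a WebSocket or server-sent-event stream instead of a list, which is what lets the client render the response incrementally.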