Add support for streaming with chains
aiswaryasankar opened this issue · comments
aiswaryasankar commented
Would be great to support streaming for LangChain chains - e.g.

```python
from langchain.callbacks.manager import CallbackManager
from langchain.chains import ConversationalRetrievalChain
from langchain.chat_models import ChatOpenAI

# `kwargs` and `retriever` come from the surrounding application code.
streaming_handler = kwargs.get('streaming_handler')
model = ChatOpenAI(
    model='gpt-3.5-turbo',
    temperature=0.0,
    verbose=True,
    streaming=True,  # Pass `streaming=True` so the client receives tokens as they are generated.
    callback_manager=CallbackManager([streaming_handler]),
)
qa = ConversationalRetrievalChain.from_llm(model, retriever=retriever)
```
Deepankar Mahapatro commented
This is now supported by passing the callback manager to the chains as well.
langchain-serve/lcserve/apps/autogpt/helper.py
Lines 120 to 125 in 24da78d
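To make the mechanics concrete, here is a minimal self-contained sketch of the pattern the fix relies on. This is not LangChain code: `StreamingHandler`, `FakeStreamingLLM`, and `FakeChain` are hypothetical stand-ins that mimic how a shared callback manager lets tokens stream from the model, through the chain, to a handler.

```python
class StreamingHandler:
    """Collects tokens as the model streams them (stand-in for a LangChain callback handler)."""
    def __init__(self):
        self.tokens = []

    def on_llm_new_token(self, token):
        self.tokens.append(token)


class CallbackManager:
    """Fans each streamed token out to every registered handler."""
    def __init__(self, handlers):
        self.handlers = handlers

    def dispatch_token(self, token):
        for handler in self.handlers:
            handler.on_llm_new_token(token)


class FakeStreamingLLM:
    """Stand-in for ChatOpenAI(streaming=True): emits tokens one at a time."""
    def __init__(self, callback_manager):
        self.callback_manager = callback_manager

    def generate(self, prompt):
        answer = "streamed answer"
        for token in answer.split():
            self.callback_manager.dispatch_token(token)
        return answer


class FakeChain:
    """Stand-in for ConversationalRetrievalChain: the same callback manager
    is passed to the chain so nested LLM calls stream through it too."""
    def __init__(self, llm, callback_manager):
        self.llm = llm
        self.callback_manager = callback_manager

    def run(self, query):
        return self.llm.generate(query)


handler = StreamingHandler()
manager = CallbackManager([handler])
llm = FakeStreamingLLM(callback_manager=manager)
chain = FakeChain(llm, callback_manager=manager)
result = chain.run("What is streaming?")
print(handler.tokens)  # tokens arrived incrementally: ['streamed', 'answer']
```

The point of the fix is that wiring the callback manager only into the model is not enough; the chain must receive it as well so that LLM calls made inside the chain stream through the same handlers.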