[Bug] ConversationalRetrievalChain is not able to retain memory.
Opened by aayushxrj
I implemented ConversationalRetrievalChain following Chainlit's official documentation, but memory retention does not work: despite following the guidelines, the bot fails to retain prior turns, which breaks its conversational capabilities.
Approaches to Solve this Issue:
1. Use LLMChain
2. Use load_qa_chain
3. Use RunnablePassthrough (using Approach 1 from app.py)
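The core of Approach 3 is to thread the chat history through the chain explicitly on every call, rather than relying on the chain's built-in memory. A minimal, library-free sketch of that pattern (all names here are illustrative stand-ins, not the actual LangChain API):

```python
class FakeLLM:
    """Stand-in for a chat model: reports how many history turns it saw."""
    def invoke(self, prompt: dict) -> str:
        return (f"answer to {prompt['question']!r} "
                f"(history: {len(prompt['chat_history'])} turns)")

def make_chain(llm):
    """Build a chain that injects accumulated history into every prompt."""
    chat_history: list[tuple[str, str]] = []

    def invoke(question: str) -> str:
        # History is passed into the prompt on each call -- this is the
        # step the broken ConversationalRetrievalChain setup was missing.
        answer = llm.invoke({"question": question,
                             "chat_history": chat_history})
        chat_history.append((question, answer))  # retain the new turn
        return answer

    return invoke

chain = make_chain(FakeLLM())
chain("What is RAG?")
print(chain("Can you elaborate?"))  # second call sees 1 prior turn
```

In the real fix, the same shape is expressed with LangChain's RunnablePassthrough and a prompt containing a chat-history placeholder, per the "Add Chat History" docs linked below.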
Commit 9857324 resolves the issue. The fix follows Approach 3 from the issue description.
For further details, please refer to the following documentation:
LangChain's official docs for Q/A with RAG: Add Chat History
LangChain's official docs for ainvoke()
LangChain's official docs for Callbacks
GitHub issue mentioning usage of config in ainvoke()