aayushxrj / Pluto.ai

A conversational Retrieval Augmented Generation (RAG) chatbot built with LangChain, Chainlit, and the Claude API for question-answering from documents.

[Bug] ConversationalRetrievalChain is not able to retain memory.

aayushxrj opened this issue

I have implemented ConversationalRetrievalChain following Chainlit's official documentation, but memory retention is not working as expected: the bot does not carry chat history across turns, so follow-up questions lose their conversational context.
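For context, a minimal sketch of the kind of setup described above, assuming Chainlit's documented pattern of a ConversationalRetrievalChain with a ConversationBufferMemory stored in the user session; `llm` and `retriever` are placeholders, not the actual app.py code:

```python
import chainlit as cl
from langchain.chains import ConversationalRetrievalChain
from langchain.memory import ConversationBufferMemory
from langchain_anthropic import ChatAnthropic

@cl.on_chat_start
async def on_chat_start():
    # retriever is assumed to be built elsewhere from the uploaded documents.
    memory = ConversationBufferMemory(memory_key="chat_history", return_messages=True)
    chain = ConversationalRetrievalChain.from_llm(
        llm=ChatAnthropic(model="claude-3-haiku-20240307"),  # placeholder model
        retriever=retriever,
        memory=memory,
    )
    cl.user_session.set("chain", chain)

@cl.on_message
async def on_message(message: cl.Message):
    chain = cl.user_session.get("chain")
    # Expectation: memory supplies chat_history on every turn;
    # the reported behaviour is that the history is not retained.
    res = await chain.ainvoke({"question": message.content})
    await cl.Message(content=res["answer"]).send()
```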

Helpful Medium blog

Approaches to Solve this Issue:

1. Use LLMChain (see the first sketch after this list)

2. Use load_qa_chain (see the second sketch after this list)

  • Github Issue Docs: Link
  • Stack Overflow: Link
  • Example Usage: Link

3. Use RunnablePassthrough (Using Approach 1 from app.py)

  • Official Docs 1: Link
  • Official Docs 2: Link
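Rough sketches of what Approaches 1 and 2 could look like. These are illustrative only: `llm`, `retriever`, and `query` are assumed to exist already, and the prompts are invented for the example rather than taken from app.py.

Approach 1, an LLMChain with its own ConversationBufferMemory, with retrieval done manually and the retrieved text injected as a context variable:

```python
from langchain.chains import LLMChain
from langchain.memory import ConversationBufferMemory
from langchain.prompts import PromptTemplate

prompt = PromptTemplate(
    input_variables=["chat_history", "context", "question"],
    template=(
        "Answer using the context below.\n\n"
        "Context:\n{context}\n\n"
        "Conversation so far:\n{chat_history}\n\n"
        "Question: {question}\nAnswer:"
    ),
)
# input_key tells the memory which input is the user message to record.
memory = ConversationBufferMemory(memory_key="chat_history", input_key="question")
qa_chain = LLMChain(llm=llm, prompt=prompt, memory=memory)

# Retrieval happens outside the chain; the documents are stuffed into {context}.
docs = retriever.get_relevant_documents(query)
answer = qa_chain.run(context="\n\n".join(d.page_content for d in docs), question=query)
```

Approach 2, load_qa_chain with a memory and a prompt that includes {chat_history}:

```python
from langchain.chains.question_answering import load_qa_chain
from langchain.memory import ConversationBufferMemory
from langchain.prompts import PromptTemplate

template = (
    "Use the context to answer the question.\n\n"
    "{context}\n\n"
    "{chat_history}\n"
    "Human: {question}\n"
    "Assistant:"
)
prompt = PromptTemplate(
    input_variables=["context", "chat_history", "question"], template=template
)
memory = ConversationBufferMemory(memory_key="chat_history", input_key="question")
qa_chain = load_qa_chain(llm, chain_type="stuff", prompt=prompt, memory=memory)

docs = retriever.get_relevant_documents(query)
result = qa_chain({"input_documents": docs, "question": query}, return_only_outputs=True)
```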

Commit 9857324 resolves the issue.

The solution follows Approach 3 (RunnablePassthrough) as outlined in the issue description.

For further details, please refer to the following documentation:

  • LangChain's official docs for Q/A with RAG: Add Chat History
  • LangChain's official docs for ainvoke()
  • LangChain's official docs for Callbacks
  • Github issue mentioning usage of config in ainvoke()
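For reference, a minimal sketch of the direction Approach 3 points in: an LCEL chain built with RunnablePassthrough, with chat history kept in the Chainlit user session and passed to ainvoke() alongside a Chainlit callback handler in config. The prompt, variable names, and session keys are illustrative assumptions, not copied from commit 9857324; `llm` and `retriever` are assumed to exist already.

```python
import chainlit as cl
from langchain_core.messages import AIMessage, HumanMessage
from langchain_core.output_parsers import StrOutputParser
from langchain_core.prompts import ChatPromptTemplate, MessagesPlaceholder
from langchain_core.runnables import RunnablePassthrough

prompt = ChatPromptTemplate.from_messages([
    ("system", "Answer the question using the retrieved context:\n\n{context}"),
    MessagesPlaceholder(variable_name="chat_history"),
    ("human", "{question}"),
])

def format_docs(docs):
    return "\n\n".join(d.page_content for d in docs)

# RunnablePassthrough.assign keeps question/chat_history and adds the retrieved context.
rag_chain = (
    RunnablePassthrough.assign(
        context=lambda x: format_docs(retriever.get_relevant_documents(x["question"]))
    )
    | prompt
    | llm
    | StrOutputParser()
)

@cl.on_message
async def on_message(message: cl.Message):
    chat_history = cl.user_session.get("chat_history") or []
    answer = await rag_chain.ainvoke(
        {"question": message.content, "chat_history": chat_history},
        config={"callbacks": [cl.AsyncLangchainCallbackHandler()]},
    )
    # Persist the turn so the next call sees the full history.
    chat_history += [HumanMessage(content=message.content), AIMessage(content=answer)]
    cl.user_session.set("chat_history", chat_history)
    await cl.Message(content=answer).send()
```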