alejandro-ao / ask-multiple-pdfs

A Langchain app that allows you to chat with multiple PDFs

TypeError: can only concatenate str (not "tuple") to str

HagaiHen opened this issue · comments

On the first question it works fine, but after I send the next question it throws this error:

2023-06-07 14:40:00.505 Uncaught app exception
Traceback (most recent call last):
  File "c:\Users\Hagai\Desktop\Projects\Chat With PDFs\.venv\lib\site-packages\streamlit\runtime\scriptrunner\script_runner.py", line 552, in _run_script
    exec(code, module.__dict__)
  File "C:\Users\Hagai\Desktop\Projects\Chat With PDFs\app2.py", line 104, in <module>
    main()
  File "C:\Users\Hagai\Desktop\Projects\Chat With PDFs\app2.py", line 81, in main
    handle_userinput(user_question)
  File "C:\Users\Hagai\Desktop\Projects\Chat With PDFs\app2.py", line 55, in handle_userinput
    response = st.session_state.conversation({'question': user_question})
  File "c:\Users\Hagai\Desktop\Projects\Chat With PDFs\.venv\lib\site-packages\langchain\chains\base.py", line 116, in __call__
    raise e
  File "c:\Users\Hagai\Desktop\Projects\Chat With PDFs\.venv\lib\site-packages\langchain\chains\base.py", line 113, in __call__
    outputs = self._call(inputs)
  File "c:\Users\Hagai\Desktop\Projects\Chat With PDFs\.venv\lib\site-packages\langchain\chains\conversational_retrieval\base.py", line 71, in _call
    chat_history_str = get_chat_history(inputs["chat_history"])
  File "c:\Users\Hagai\Desktop\Projects\Chat With PDFs\.venv\lib\site-packages\langchain\chains\conversational_retrieval\base.py", line 25, in _get_chat_history     
    human = "Human: " + human_s
TypeError: can only concatenate str (not "tuple") to str
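The traceback points at LangChain's internal `_get_chat_history` helper, which builds the prompt by concatenating `"Human: "` with each history entry. The sketch below is a simplified, dependency-free stand-in (the name `format_chat_history` is hypothetical) showing why the error appears: the helper expects each history entry to be a pair of strings, so if a tuple ends up where a string is expected, the concatenation fails with exactly this TypeError.

```python
# Simplified stand-in for LangChain's _get_chat_history: it assumes each
# history entry is a (human_string, ai_string) pair and concatenates them
# into the prompt.
def format_chat_history(chat_history):
    buffer = ""
    for human_s, ai_s in chat_history:
        buffer += "Human: " + human_s + "\nAssistant: " + ai_s + "\n"
    return buffer

# Well-formed history: a list of (question, answer) string pairs works.
ok = format_chat_history([("Hi", "Hello!")])

# Malformed history: an extra level of nesting makes human_s a tuple, so
# "Human: " + human_s raises the same TypeError as in the traceback.
try:
    format_chat_history([(("Hi", "Hello!"), "next")])
except TypeError as e:
    print(e)  # can only concatenate str (not "tuple") to str
```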

facing the same issue

same issue

So I fixed this problem. Here's what you need to change to make it work.

Replace the functions 'handle_userinput' and 'get_conversation_chain' with these:

def get_conversation_chain(vectorstore):
    llm = ChatOpenAI()
    # No memory object here: the chat history is passed in explicitly on
    # every call instead of being tracked inside the chain.
    conversation_chain = ConversationalRetrievalChain.from_llm(
        llm=llm,
        retriever=vectorstore.as_retriever(),
    )
    return conversation_chain


def handle_userinput(user_question):
    # Pass the history of (question, answer) string tuples explicitly.
    response = st.session_state.conversation(
        {"question": user_question, "chat_history": st.session_state.chat_history}
    )
    st.session_state.chat_history.append((user_question, response["answer"]))

    # Render the newest exchange first.
    history = st.session_state.chat_history[::-1]
    for user_msg, bot_msg in history:
        st.write(bot_template.replace("{{MSG}}", bot_msg), unsafe_allow_html=True)
        st.write(user_template.replace("{{MSG}}", user_msg), unsafe_allow_html=True)

In the main function, initialize 'chat_history' like this:

    if "chat_history" not in st.session_state:
        st.session_state.chat_history = []

Now, it should work.
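The fix above boils down to one idea: the app itself owns the history as a list of (question, answer) string tuples and hands it to the chain on every call, rather than letting the chain's memory accumulate entries in the wrong shape. The dependency-free sketch below simulates that flow; `fake_chain` is a hypothetical stand-in for the ConversationalRetrievalChain and Streamlit's session state is replaced by a plain list.

```python
# Stand-in for ConversationalRetrievalChain: echoes the question and
# reports how many prior turns it received.
def fake_chain(inputs):
    n = len(inputs["chat_history"])
    return {"answer": f"answer #{n + 1} to {inputs['question']!r}"}

chat_history = []  # mirrors st.session_state.chat_history

def handle_userinput(user_question):
    # The history is passed in explicitly each call, so entries are
    # always (question, answer) string pairs and never nested tuples.
    response = fake_chain({"question": user_question, "chat_history": chat_history})
    chat_history.append((user_question, response["answer"]))
    return response["answer"]

handle_userinput("What is in the PDF?")
handle_userinput("Summarize it")  # the second question no longer crashes
```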