[Enhancement]: Support Llama-2 in LangChain mode
jaelgu opened this issue · comments
What would you like to be added?
Akcio offers two options to build the system: LangChain or Towhee.
The Towhee option already supports Llama-2 as the LLM.
To support Llama-2 in LangChain mode, we need to add a `llama_2_chat.py`
under https://github.com/zilliztech/akcio/tree/main/src_langchain/llm.
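Whatever backend the new script wraps, it will need to build prompts in Llama-2's chat instruction format (`[INST]`/`<<SYS>>` markers, with the system prompt folded into the first user turn). A minimal sketch of that formatting; the helper name and its default system prompt are illustrative, not taken from the Akcio codebase:

```python
# Llama-2 chat prompt format. The special markers below are defined by
# the Llama-2 chat models; the function itself is a hypothetical helper.
B_INST, E_INST = "[INST]", "[/INST]"
B_SYS, E_SYS = "<<SYS>>\n", "\n<</SYS>>\n\n"

def build_llama2_prompt(history, system_prompt="You are a helpful assistant."):
    """history: list of (user, assistant) turns; the final assistant
    reply may be None when we are asking the model to generate it."""
    prompt = ""
    for i, (user, assistant) in enumerate(history):
        text = user
        if i == 0:
            # Llama-2 folds the system prompt into the first user turn.
            text = f"{B_SYS}{system_prompt}{E_SYS}{user}"
        prompt += f"<s>{B_INST} {text} {E_INST}"
        if assistant is not None:
            prompt += f" {assistant} </s>"
    return prompt
```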
With Llama-2 supported in LangChain mode, the following steps should start the service successfully:
- Set up

  Change to your own Milvus & Postgres connection details (modify `config.py` if needed):

  ```shell
  $ export LLM_OPTION=llama_2
  $ export MILVUS_URI=https://localhost:19530
  $ export SQL_URI=postgresql://postgres:postgres@localhost/chat_history
  ```

- Start service

  ```shell
  $ python main.py --langchain
  ```

  Or start the Gradio demo:

  ```shell
  $ python gradio_demo.py --langchain
  ```
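The `LLM_OPTION` environment variable is what selects the LLM module at startup, so the new script only works once it is registered there. A minimal sketch of that kind of dispatch; the mapping and module names are hypothetical, not Akcio's actual `config.py`:

```python
import os

# Hypothetical mapping from LLM_OPTION values to chat modules under
# src_langchain/llm; Akcio's real config.py may differ.
SUPPORTED_LLMS = {
    "openai": "openai_chat",
    "llama_2": "llama_2_chat",
}

def pick_llm_module():
    """Resolve the chat module name from the LLM_OPTION env var."""
    option = os.environ.get("LLM_OPTION", "openai")
    if option not in SUPPORTED_LLMS:
        raise ValueError(f"Unsupported LLM_OPTION: {option!r}")
    return SUPPORTED_LLMS[option]
```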
Why is this needed?
No response
Anything else?
No response
Hello! I am new to this type of issue and I would like to work on this.
Please assign me this project #58.
Cool! I've assigned you this issue.
I have gone through several pieces of documentation related to this issue (LLMs, LangChain, etc.), but I still have some difficulties:
- How do I create the `llama_2_chat.py` script?
- What other documentation or topics should I focus on in order to solve this issue?
- Could you explain what this project is about and what the main motive of this issue is?
You can check out the description of this issue. The target is to successfully start the chatbot service using the commands in the description.
- To create `llama_2_chat.py`, you can refer to any LLM script under https://github.com/zilliztech/akcio/tree/main/src_langchain/llm, which is where `llama_2_chat.py` should live.
- Here is the contributing guide: https://github.com/zilliztech/akcio/blob/main/Contributing.md
- Once you submit a PR, I will review the code and merge it.
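To show the general shape such a script takes, here is a dependency-free skeleton. A real implementation would subclass LangChain's `BaseChatModel` and implement its generation hook; the class, method names, and default model name below are illustrative only:

```python
# Hypothetical skeleton of a llama_2_chat.py. Real Akcio code would
# integrate with LangChain's chat-model interface instead of this
# plain class, and _call_backend would invoke an actual Llama-2 model.

class Llama2Chat:
    """Illustrative chat wrapper around a Llama-2 backend."""

    def __init__(self, model_name: str = "llama-2-7b-chat", temperature: float = 0.0):
        self.model_name = model_name
        self.temperature = temperature

    def _call_backend(self, prompt: str) -> str:
        # Placeholder: a real version would call llama.cpp, a Hugging Face
        # pipeline, or a hosted inference endpoint here.
        return f"[{self.model_name}] echo: {prompt}"

    def generate(self, user_message: str) -> str:
        # Wrap the user message in Llama-2's instruction markers.
        prompt = f"<s>[INST] {user_message} [/INST]"
        return self._call_backend(prompt)
```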
@jaelgu @LovishGarg2004 Addressing this issue here #70
- You can go ahead with it. 👍
- I'm dropping this due to some issues on my end.