This README provides instructions for setting up and running a chat application built with the LangChain and Streamlit libraries. Users enter questions, which are answered by an AI language model.
- Create and activate a virtual environment:

```shell
python -m venv env
source env/bin/activate  # On Windows: env\Scripts\activate
```
- Install the required packages (`chat_models` and `callbacks` are modules inside LangChain, not separate PyPI packages, so they do not need to be installed individually):

```shell
pip install streamlit langchain langchain-community
```
- Update `LLM_MODEL` and `LLM_BASE_URL` in the script with the correct values for your environment. For this example, use `llama2:latest` and `http://localhost:11434`.
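The configuration step above might look like the following sketch in `chat.py`. The constant names `LLM_MODEL` and `LLM_BASE_URL` come from this README; the `build_llm` helper and the `langchain_community` import path are assumptions and may differ depending on your LangChain version.

```python
# chat.py — configuration for the local Ollama-served model.
# LLM_MODEL and LLM_BASE_URL are the values this README suggests;
# change them to match your environment.
LLM_MODEL = "llama2:latest"
LLM_BASE_URL = "http://localhost:11434"

def build_llm():
    """Construct the LLM client from the config above (hypothetical helper).

    The import is done lazily so the configuration constants can be read
    even before LangChain is installed.
    """
    # Assumed import path; older LangChain releases expose this
    # under langchain.llms instead of langchain_community.llms.
    from langchain_community.llms import Ollama
    return Ollama(model=LLM_MODEL, base_url=LLM_BASE_URL)
```

Keeping the model name and base URL as top-level constants makes it easy to point the app at a different model or a remote Ollama server without touching the rest of the script.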
- Open a terminal or command prompt.
- Navigate to the directory containing your Streamlit Python script.
- Run `streamlit run chat.py`. This starts the Streamlit server and serves the app from that script.