This Chainlit app provides a sandbox environment for chatting with various open-source large language models (LLMs) hosted on RunPod. It uses Chainlit for the chat UI and the OpenAI client library to talk to the models.
## Features

- **Streamed responses**: get responses from the LLM in real time as they are generated.
- **Session management**: maintains conversation history for context-aware responses.
- **Customization**: easily switch between different models and adjust parameters such as temperature and maximum tokens.
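The streaming behavior can be sketched in plain Python: the app iterates over response chunks and appends each one to the reply as it arrives. This is an illustrative sketch, not code from the app itself; `fake_stream` stands in for the OpenAI client's streamed response.

```python
from typing import Iterable, Iterator

def fake_stream(text: str) -> Iterator[str]:
    """Stand-in for a streamed LLM response: yields one token at a time."""
    for token in text.split(" "):
        yield token + " "

def accumulate_stream(chunks: Iterable[str]) -> str:
    """Concatenate streamed chunks the way the UI would render them."""
    reply = ""
    for chunk in chunks:
        reply += chunk  # in the real app, each chunk is pushed to the UI here
    return reply.rstrip()

print(accumulate_stream(fake_stream("Hello from the model")))  # → Hello from the model
```

In the actual app, Chainlit's message-streaming API plays the role of the loop body, updating the displayed message with each chunk.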
## Prerequisites

- Python 3.7+
- A RunPod API key
## Installation

### Using Poetry

1. Install pipx: follow the instructions on the pipx installation page.
2. Install Poetry: follow the instructions on the Poetry installation page.
3. Clone the repository:

   ```sh
   git clone <repository-url>
   cd <repository-directory>
   ```

4. Install the dependencies:

   ```sh
   poetry install
   ```

5. Activate the virtual environment:

   ```sh
   poetry shell
   ```
### Without Poetry

If you prefer not to use Poetry, you can manage the dependencies manually.

1. Create and activate a virtual environment:

   ```sh
   python -m venv myenv
   source myenv/bin/activate
   ```

2. Install the dependencies:

   ```sh
   pip install chainlit openai
   ```
## Configuration

- Set your RunPod API key: replace `<YOUR_RUNPOD_API_KEY>` in the script with your actual RunPod API key.
- Specify the model: update the `model` variable with the desired model name, e.g. `"mistralai/Mistral-7B-Instruct-v0.2"`.
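A minimal sketch of how these two settings might be wired together, assuming the app talks to RunPod's OpenAI-compatible endpoint (the URL shape, endpoint ID, and variable names below are assumptions for illustration, not taken from this repository):

```python
import os

# Hypothetical configuration block; adapt the names to the actual app.py.
RUNPOD_API_KEY = os.environ.get("RUNPOD_API_KEY", "<YOUR_RUNPOD_API_KEY>")
ENDPOINT_ID = "<your-endpoint-id>"  # placeholder: your RunPod endpoint ID

# RunPod's vLLM workers expose an OpenAI-compatible base URL of roughly this
# shape (an assumption here; check your endpoint's docs for the exact path).
BASE_URL = "https://api.runpod.ai/v2/{}/openai/v1".format(ENDPOINT_ID)

model = "mistralai/Mistral-7B-Instruct-v0.2"

# With the `openai` package, the client would then be constructed as:
# client = openai.OpenAI(api_key=RUNPOD_API_KEY, base_url=BASE_URL)
```

Reading the key from an environment variable rather than hard-coding it keeps the secret out of version control.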
## Running the App

With Poetry:

```sh
poetry shell
chainlit run app.py -w
```

With a manually created virtual environment:

```sh
source myenv/bin/activate
chainlit run app.py -w
```
## Usage

- Once the application is running, open your browser and navigate to the URL shown in the terminal.
- Type a message to chat with the model.
- The application maintains the conversation history, so responses are context-aware.
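The history handling described above can be sketched in plain Python: each turn is appended to a list of messages in the OpenAI chat format, and the full list is sent with every request so the model sees prior turns. This is an illustrative sketch; the real app would keep this list in the Chainlit user session, and the function name is hypothetical.

```python
from typing import Dict, List

def append_turn(history: List[Dict[str, str]], role: str, content: str) -> List[Dict[str, str]]:
    """Append one message in the OpenAI chat-completions format."""
    history.append({"role": role, "content": content})
    return history

# Each request sends the full history, so the model sees prior turns.
history = [{"role": "system", "content": "You are a helpful assistant."}]
append_turn(history, "user", "What is RunPod?")
append_turn(history, "assistant", "RunPod is a GPU cloud platform.")
append_turn(history, "user", "Can it host LLMs?")  # the model now has context

print(len(history))  # → 4
```

Because the whole list is resent on each turn, long conversations eventually hit the model's context window; trimming or summarizing old turns is a common mitigation.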
## Contributing

Contributions are welcome! Please open an issue or submit a pull request with your changes.
## License

This project is licensed under the MIT License.