This project is a web-based chat application powered by a locally running Ollama model. It's built with Flask, a lightweight WSGI web application framework in Python. The app handles conversations, maintains context across turns, and integrates with external data sources for enriched interaction capabilities.
- Real-time chatting experience with contextual memory.
- New chat session initiation from the UI.
- Chat session history display on the sidebar.
- Integration with external data sources like databases and file systems for dynamic responses.
- Markdown rendering for rich text responses.
These instructions will get you a copy of the project up and running on your local machine for development and testing purposes.
What you need to install the software:
- Python 3.6+
- pip
- Virtual environment (recommended)
1. Clone the repository to your local machine:

   ```bash
   git clone https://github.com/pritom007/ollama_chatbot.git
   ```

2. Navigate to the cloned directory:

   ```bash
   cd ollama_chatbot
   ```

3. Create a virtual environment:

   ```bash
   python -m venv venv
   ```

4. Activate the virtual environment:

   - On Mac/Linux:

     ```bash
     source venv/bin/activate
     ```

   - On Windows:

     ```bash
     .\venv\Scripts\activate
     ```

5. Install the required packages:

   ```bash
   pip install -r requirements.txt
   ```

6. Start the Flask application:

   ```bash
   flask run
   ```

   Or run it with Python directly:

   ```bash
   python3 app.py
   ```
Open your web browser and navigate to http://127.0.0.1:5000/ to start chatting with the Ollama chatbot.
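Under the hood, each chat turn is typically forwarded to Ollama's local REST API (by default at `http://localhost:11434/api/chat`), with the prior conversation included so the model keeps context. The sketch below illustrates that flow; the model name (`llama3`), the helper names, and the use of the standard library instead of the app's actual HTTP client are illustrative assumptions, not the project's exact code.

```python
import json
import urllib.request

# Ollama's default local chat endpoint (assumed; configure to match your setup).
OLLAMA_URL = "http://localhost:11434/api/chat"


def build_chat_payload(history, user_message, model="llama3"):
    """Append the new user turn to the prior turns and build the request body.

    `history` is a list of {"role": ..., "content": ...} dicts, matching the
    message format Ollama's /api/chat endpoint expects.
    """
    messages = list(history) + [{"role": "user", "content": user_message}]
    return {"model": model, "messages": messages, "stream": False}


def ask_ollama(history, user_message):
    """Send one chat turn to Ollama and return the assistant's reply text."""
    payload = build_chat_payload(history, user_message)
    req = urllib.request.Request(
        OLLAMA_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["message"]["content"]
```

A Flask route would then call `ask_ollama` with the session's stored history and render the returned Markdown in the UI.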
To initiate a new chat session, click the "New Chat" button.
Click on a chat session in the sidebar to view its history.
Please read CONTRIBUTING.md for details on our code of conduct and the process for submitting pull requests.
Your Name - Initial work - YourUsername
See also the list of contributors who participated in this project.