Repositories under the ollama-chat topic.
The most no-nonsense, locally or API-hosted AI code completion plugin for Visual Studio Code - like GitHub Copilot but 100% free.
A single-file tkinter-based Ollama GUI project with no external dependencies.
Simple html ollama chatbot that is easy to install. Simply copy the html file on your computer and run it.
A modern web interface for [Ollama](https://ollama.ai/), with DeepSeek support planned for the next version.
BeautifyOllama is an open-source web interface that enhances your local Ollama AI model interactions with a beautiful, modern design. Built with cutting-edge web technologies, it provides a seamless chat experience with stunning visual effects and enterprise-grade functionality.
Ollama with Let's Encrypt Using Docker Compose
PuPu is a lightweight tool that makes it easy to run AI models on your own device. Designed for smooth performance and ease of use, PuPu is perfect for anyone who wants quick access to AI without technical complexity.
Ollama Client – Chat with Local LLMs Inside Your Browser. A lightweight, privacy-first Chrome extension for chatting with locally hosted Ollama models such as LLaMA 2, Mistral, and CodeLLaMA. Supports streaming, stop/regenerate, and easy model switching, all without cloud APIs or data leaks.
Ollama Web UI is a simple yet powerful web-based interface for interacting with large language models. It offers chat history, voice commands, voice output, model download and management, conversation saving, terminal access, multi-model chat, and more—all in one streamlined platform.
Streamlit Chatbot using Ollama Open Source LLMs
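A minimal sketch of this Streamlit-plus-Ollama pattern, assuming the `streamlit` and `ollama` Python packages and a locally pulled model named `llama3` (hypothetical choices, not taken from the repository):

```python
# streamlit_ollama_chat.py -- minimal Streamlit chat loop against a local Ollama server.
# Assumes `pip install streamlit ollama` and `ollama pull llama3` have been run.
import ollama
import streamlit as st

st.title("Local Ollama Chat")

# Keep the conversation across Streamlit reruns.
if "messages" not in st.session_state:
    st.session_state.messages = []

# Replay the history so the page shows the full conversation.
for msg in st.session_state.messages:
    with st.chat_message(msg["role"]):
        st.markdown(msg["content"])

if prompt := st.chat_input("Ask the local model..."):
    st.session_state.messages.append({"role": "user", "content": prompt})
    with st.chat_message("user"):
        st.markdown(prompt)

    # Send the whole history to Ollama and display the reply.
    reply = ollama.chat(model="llama3", messages=st.session_state.messages)
    answer = reply["message"]["content"]
    st.session_state.messages.append({"role": "assistant", "content": answer})
    with st.chat_message("assistant"):
        st.markdown(answer)
```

Run it with `streamlit run streamlit_ollama_chat.py` while the Ollama server is running on its default port.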
AI model deployment on Synology NAS and macOS 🧠🐳
Ollama client for Android.
"A simple and lightweight client-server program for interfacing with local LLMs using ollama, and LLMs in groq using groq api."
Ollama GUI desktop application.
A modern, feature-rich web interface built with Next.js and shadcn/ui for interacting with local Ollama large language models.
A web interface for Ollama, providing a user-friendly way to interact with local language models.
This is a simple but functional chat UI for Ollama that can easily be dropped into any web app as a floating chat widget backed by Ollama responses.
A lightweight local AI chatbot powered by Ollama and LLMs. Built using Python sockets and multi-threading to handle multiple users at once. Designed for simple, friendly English conversations with emoji-rich replies. 🌟
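As an illustration of the socket-plus-threading pattern described above (not the repository's actual code), here is a minimal sketch that hands each client connection to its own thread and answers via the local Ollama HTTP API, assumed at the default http://localhost:11434 with an assumed model name:

```python
# threaded_ollama_server.py -- one thread per client, replies generated by a local Ollama model.
import socket
import threading

import requests  # pip install requests

OLLAMA_URL = "http://localhost:11434/api/generate"
MODEL = "llama3"  # assumed model name; use any model you have pulled

def ask_ollama(prompt: str) -> str:
    # Non-streaming call to Ollama's generate endpoint.
    resp = requests.post(OLLAMA_URL, json={"model": MODEL, "prompt": prompt, "stream": False})
    resp.raise_for_status()
    return resp.json()["response"]

def handle_client(conn: socket.socket, addr) -> None:
    with conn:
        while True:
            data = conn.recv(4096)
            if not data:
                break  # client disconnected
            reply = ask_ollama(data.decode("utf-8", errors="replace"))
            conn.sendall(reply.encode("utf-8"))

def main() -> None:
    server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    server.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    server.bind(("0.0.0.0", 5555))
    server.listen()
    print("Chat server listening on port 5555")
    while True:
        conn, addr = server.accept()
        # One thread per client so several users can chat at once.
        threading.Thread(target=handle_client, args=(conn, addr), daemon=True).start()

if __name__ == "__main__":
    main()
```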
This desktop application, built with customtkinter, provides an interactive chat interface for local Large Language Models (LLMs) served via Ollama.
AI-powered multi-agent system built using the Google Agent Developer Toolkit, designed to streamline complex tasks across finance, web intelligence, and database interaction. This suite enables seamless orchestration between specialized agents, each with domain expertise, to collaboratively process and fulfill user intents in real-time.
This project uses Ollama, a lightweight and flexible platform for running Large Language Models (LLMs) locally, enabling efficient language-model development and experimentation without a cloud connection.
A feature-rich Ollama client with enhanced terminal UI using the Rich library
Simple SwiftUI app for chatting with Ollama backend
Ollama‑Chat is a Streamlit-based web UI for interacting with Ollama-hosted language models. It provides a clean, modern chat interface where users can choose from multiple local Ollama models, adjust parameters such as temperature and system prompts, view and preserve chat history, and customize settings via a sidebar.
DictAi helps you understand any word by giving both its dictionary definition and a simple, beginner-friendly explanation. It runs entirely on your device using Ollama’s fast phi model — no internet required.
AI Model for Competitive Exams: a Flask-based web app that uses RAG (Retrieval-Augmented Generation) with ChromaDB and OCR to answer MPSC/UPSC questions. Users can input queries via text or images; the system retrieves relevant content from study materials, applies NLP for context, and generates accurate, syllabus-aligned responses.
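A sketch of the retrieve-then-generate step such an app performs, assuming the `chromadb` Python package and Ollama's generate endpoint; the collection and model names are illustrative, not the app's actual configuration:

```python
# rag_query.py -- sketch of a RAG lookup against ChromaDB followed by an Ollama completion.
# Assumes `pip install chromadb requests` and a running local Ollama server.
import chromadb
import requests

client = chromadb.Client()
collection = client.get_or_create_collection("study_materials")  # assumed collection name

def answer(question: str) -> str:
    # 1) Retrieve the most relevant study-material chunks.
    hits = collection.query(query_texts=[question], n_results=3)
    context = "\n".join(hits["documents"][0])

    # 2) Ask a local Ollama model to answer using only that context.
    prompt = (
        "Answer the exam question using only the context below.\n\n"
        f"Context:\n{context}\n\nQuestion: {question}"
    )
    resp = requests.post(
        "http://localhost:11434/api/generate",
        json={"model": "llama3", "prompt": prompt, "stream": False},  # assumed model name
    )
    resp.raise_for_status()
    return resp.json()["response"]
```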
Django Chat App based on Ollama Models
Ollama load-balancing server | A high-performance, easy-to-configure open-source load balancer optimized for Ollama workloads. It helps improve application availability and response times while making efficient use of system resources.
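To illustrate what load balancing in front of several Ollama instances can look like, here is a simple round-robin proxy sketch; it is not this project's implementation, and the backend addresses and Flask-based approach are assumptions:

```python
# ollama_round_robin.py -- tiny round-robin proxy in front of several Ollama backends (illustrative only).
# Assumes `pip install flask requests`; streaming responses are not handled in this sketch.
import itertools

import requests
from flask import Flask, Response, request

# Assumed backend addresses; replace with your own Ollama instances.
BACKENDS = ["http://10.0.0.1:11434", "http://10.0.0.2:11434"]
next_backend = itertools.cycle(BACKENDS)

app = Flask(__name__)

@app.route("/api/<path:endpoint>", methods=["GET", "POST"])
def proxy(endpoint: str) -> Response:
    # Pick the next backend in rotation and forward the request body unchanged.
    target = f"{next(next_backend)}/api/{endpoint}"
    upstream = requests.request(
        request.method,
        target,
        data=request.get_data(),
        headers={"Content-Type": "application/json"},
    )
    return Response(
        upstream.content,
        status=upstream.status_code,
        content_type=upstream.headers.get("Content-Type"),
    )

if __name__ == "__main__":
    app.run(port=8080)
```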
This project is a Python-based voice assistant that enables spoken conversations with a locally hosted LLM using Ollama.
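A minimal sketch of such a voice loop, assuming the SpeechRecognition, pyttsx3, and ollama packages (the repository may use different libraries) and an assumed model name; note that `recognize_google` sends audio to Google's web API, so a fully offline setup would swap in a local recognizer:

```python
# voice_loop.py -- speak-and-listen loop against a local Ollama model (illustrative library choices).
# Assumes `pip install SpeechRecognition pyttsx3 ollama pyaudio`.
import ollama
import pyttsx3
import speech_recognition as sr

recognizer = sr.Recognizer()
tts = pyttsx3.init()
history = []

while True:
    with sr.Microphone() as mic:
        print("Listening...")
        audio = recognizer.listen(mic)
    try:
        text = recognizer.recognize_google(audio)  # speech -> text
    except sr.UnknownValueError:
        continue  # could not understand, listen again

    history.append({"role": "user", "content": text})
    reply = ollama.chat(model="llama3", messages=history)  # assumed model name
    answer = reply["message"]["content"]
    history.append({"role": "assistant", "content": answer})

    tts.say(answer)      # text -> speech
    tts.runAndWait()
```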
Ollama - Telegram bot to send/receive messages with an LLM running locally on Ollama
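A sketch of the bot pattern, assuming python-telegram-bot version 20 or later and the ollama package (library and model choices are assumptions, not this repository's code):

```python
# ollama_telegram_bot.py -- relay Telegram messages to a local Ollama model and reply with its answer.
# Assumes `pip install python-telegram-bot ollama`.
import ollama
from telegram import Update
from telegram.ext import Application, ContextTypes, MessageHandler, filters

TOKEN = "YOUR_BOT_TOKEN"  # placeholder; obtain from @BotFather

async def reply(update: Update, context: ContextTypes.DEFAULT_TYPE) -> None:
    # Forward the message text to the local model (blocking call, fine for a sketch).
    answer = ollama.chat(
        model="llama3",  # assumed model name
        messages=[{"role": "user", "content": update.message.text}],
    )["message"]["content"]
    await update.message.reply_text(answer)

def main() -> None:
    app = Application.builder().token(TOKEN).build()
    app.add_handler(MessageHandler(filters.TEXT & ~filters.COMMAND, reply))
    app.run_polling()

if __name__ == "__main__":
    main()
```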
A modern, privacy-focused web interface for interacting with Ollama AI models. This application provides a sleek, user-friendly chat interface that runs entirely in your browser while connecting directly to your local Ollama installation.