Repositories under the ollama-ui topic.
Dive is an open-source MCP Host Desktop Application that seamlessly integrates with any LLMs supporting function calling capabilities. ✨
Kangaroo is an AI-powered SQL client and admin tool for popular databases (MariaDB / MySQL / Oracle / PostgreSQL / Redis / SQLite / SQL Server / ...) on Windows / macOS / Linux. It supports table design, queries, modeling, sync, and export/import, with a focus on a comfortable, fun, and developer-friendly experience.
Minimalistic UI for Ollama LLMs: this powerful React interface for LLMs drastically improves the chatbot experience and works offline.
LLMX: the easiest third-party local LLM UI for the web!
A single-file tkinter-based Ollama GUI project with no external dependencies.
Odin Runes, a Java-based GPT client, lets you interact with your preferred GPT model right from your favorite text editor. It also aids prompt engineering by extracting context from diverse sources using technologies such as OCR, enhancing overall productivity and saving costs.
Simpler than simple: run LLMs on your computer with Ollama, no GPU required.
A simple HTML Ollama chatbot that is easy to install: just copy the HTML file to your computer and run it.
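A single-file chatbot like this typically just POSTs to Ollama's local HTTP API from the page's script. A minimal sketch of the request-building side, assuming Ollama's default endpoint `http://localhost:11434/api/chat` and using `llama2` as a placeholder model name:

```javascript
// Build an Ollama /api/chat request from a conversation history plus the
// new user message. Pure function, so the fetch() wiring stays separate.
function buildChatRequest(model, history, userText) {
  // Ollama's chat payload is { model, messages, stream }.
  const messages = history.concat([{ role: "user", content: userText }]);
  return {
    url: "http://localhost:11434/api/chat",
    options: {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({ model, messages, stream: false }),
    },
  };
}

// In the HTML page it would be wired up roughly like:
//   const { url, options } = buildChatRequest("llama2", [], "Hello!");
//   const reply = (await (await fetch(url, options)).json()).message.content;
```

With `stream: false` Ollama returns one JSON object whose assistant reply sits under `message.content`; a streaming UI would instead read the response body line by line.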
A modern web interface for [Ollama](https://ollama.ai/), with DeepSeek support coming in the next version.
BeautifyOllama is an open-source web interface that enhances your local Ollama AI model interactions with a beautiful, modern design. Built with cutting-edge web technologies, it provides a seamless chat experience with stunning visual effects and enterprise-grade functionality.
LocalAPI.AI is a local AI management tool for Ollama, offering Web UI management and compatibility with vLLM, LM Studio, llama.cpp, Mozilla-Llamafile, Jan AI, Cortex API, Local-LLM, LiteLLM, GPT4All, and more.
Ollama with Let's Encrypt Using Docker Compose
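A setup like this pairs the Ollama container with a reverse proxy that obtains Let's Encrypt certificates automatically. A hypothetical sketch, not the repo's actual file, using Caddy (which provisions certificates on its own) and the placeholder domain `ollama.example.com`:

```yaml
# docker-compose.yml sketch: Caddy terminates TLS and proxies to Ollama.
services:
  ollama:
    image: ollama/ollama
    volumes:
      - ollama:/root/.ollama     # persist downloaded models
  caddy:
    image: caddy
    ports:
      - "80:80"                  # HTTP, needed for the ACME challenge
      - "443:443"                # HTTPS
    command: caddy reverse-proxy --from ollama.example.com --to ollama:11434
volumes:
  ollama:
```

The repo itself may use a different proxy (e.g. nginx with certbot); the point is that Ollama stays on its default port 11434 inside the Compose network while only the proxy is exposed.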
PuPu is a lightweight tool that makes it easy to run AI models on your own device. Designed for smooth performance and ease of use, PuPu is perfect for anyone who wants quick access to AI without technical complexity.
🚀 A powerful Flutter-based AI chat application that lets you run LLMs directly on your mobile device or connect to local model servers. Features offline model execution, Ollama/LLMStudio integration, and a beautiful modern UI. Privacy-focused, cross-platform, and fully open source.
Full featured demo application for OllamaSharp
Transform your writing with TextLLaMA! ✍️🚀 Simplify grammar, translate effortlessly, and compose emails like a pro. 🌍📧
HTML UI for Ollama. Minimal & responsive UI: mobile & desktop. Cross-browser support. Simple installation: host on your own server, run in your browser.
Ollama Client – Chat with Local LLMs Inside Your Browser. A lightweight, privacy-first Chrome extension for chatting with locally hosted Ollama LLM models like LLaMA 2, Mistral, and CodeLLaMA. Supports streaming, stop/regenerate, and easy model switching, all without cloud APIs or data leaks.
A Chrome extension that hosts an Ollama UI web server on localhost and other servers, helping you manage models and chat with any open-source model. 🚀💻✨
An Ollama client for Android.
This Docker Compose setup provides an isolated application with Ollama, Open-WebUI, and Nginx reverse proxy to enable secure HTTPS access. Since Open-WebUI does not support SSL natively, Nginx acts as a reverse proxy, handling SSL termination.
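The SSL-termination step described here can be sketched as an nginx server block. This is a hypothetical example, not the repo's actual config: the domain is a placeholder, and the upstream assumes Open-WebUI's default in-container port 8080 and a Compose service named `open-webui`:

```nginx
# TLS terminates here; plain HTTP is proxied to the Open-WebUI container.
server {
    listen 443 ssl;
    server_name chat.example.com;   # placeholder domain

    ssl_certificate     /etc/letsencrypt/live/chat.example.com/fullchain.pem;
    ssl_certificate_key /etc/letsencrypt/live/chat.example.com/privkey.pem;

    location / {
        proxy_pass http://open-webui:8080;   # assumed service name and port
        proxy_set_header Host $host;
        # WebSocket upgrade so streamed chat responses work through the proxy
        proxy_http_version 1.1;
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection "upgrade";
    }
}
```

Because Compose puts both containers on one network, nginx can reach Open-WebUI by service name while only ports 80/443 are published to the host.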
Frontend for the Ollama LLM, built with React.js and Flux architecture.
Ollama Web UI is a simple yet powerful web-based interface for interacting with large language models. It offers chat history, voice commands, voice output, model download and management, conversation saving, terminal access, multi-model chat, and more—all in one streamlined platform.
AI Chat UI is a responsive, modern chat interface, designed for seamless interactions with local Large Language Models (LLMs) like Ollama. It features a clean design, light/dark themes, PWA support for offline use, real-time chat with streaming responses, and local storage for chat history.
A minimal interface in pure HTML/CSS for talking with Ollama focused on ensuring you can read the code.
UI for Ollama built in Java with Swing and Ollama4j
A simple and easy web UI for Ollama.
User-friendly LLM interface, self-hosted, offline, and privacy-first.