The following repositories are listed under the ollama-webui topic.
User-friendly AI Interface (Supports Ollama, OpenAI API, ...)
Free, high-quality text-to-speech API endpoint to replace OpenAI, Azure, or ElevenLabs
Belullama is a comprehensive AI application that bundles Ollama, Open WebUI, and Automatic1111 (Stable Diffusion WebUI) into a single, easy-to-use package.
Self-host a ChatGPT-style web interface for Ollama 🦙
This repository provides resources and guidelines to facilitate the integration of Open-WebUI and Langfuse, enabling seamless monitoring and management of AI model usage statistics.
A modern web interface for [Ollama](https://ollama.ai/), with DeepSeek support planned for the next version.
BeautifyOllama is an open-source web interface that enhances your local Ollama AI model interactions with a beautiful, modern design. Built with cutting-edge web technologies, it provides a seamless chat experience with stunning visual effects and enterprise-grade functionality.
LocalAPI.AI is a local AI management tool for Ollama, offering Web UI management and compatibility with vLLM, LM Studio, llama.cpp, Mozilla-Llamafile, Jan AI, Cortex API, Local-LLM, LiteLLM, GPT4All, and more.
A Docker Compose to run a local ChatGPT-like application using Ollama, Ollama Web UI, Mistral NeMo & DeepSeek R1.
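A setup along these lines can be sketched as a minimal docker-compose.yml. This is an illustrative sketch, not the repository's actual file; the service names, image tags, and port mappings are assumptions:

```yaml
services:
  ollama:
    image: ollama/ollama                       # official Ollama image (assumed tag)
    volumes:
      - ollama:/root/.ollama                   # persist downloaded models across restarts
  webui:
    image: ghcr.io/open-webui/open-webui:main  # assumed Web UI image
    ports:
      - "3000:8080"                            # expose the UI on localhost:3000
    environment:
      - OLLAMA_BASE_URL=http://ollama:11434    # point the UI at the ollama service
    depends_on:
      - ollama
volumes:
  ollama:
```

Models such as Mistral NeMo or DeepSeek R1 would then be pulled inside the `ollama` container (e.g. `docker compose exec ollama ollama pull mistral-nemo`).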
Ollama with Let's Encrypt Using Docker Compose
PuPu is a lightweight tool that makes it easy to run AI models on your own device. Designed for smooth performance and ease of use, PuPu is perfect for anyone who wants quick access to AI without technical complexity.
An LLM AI client based on Blazor (OpenAI, ChatGPT, Llama, Ollama, ONNX, DeepSeek-R1, ...).
Persian Ollama Project: Enhance Persian (Farsi) prompts when chatting with Ollama LLMs.
A minimal interface in pure HTML/CSS for talking with Ollama focused on ensuring you can read the code.
Ollama Web UI is a simple yet powerful web-based interface for interacting with large language models. It offers chat history, voice commands, voice output, model download and management, conversation saving, terminal access, multi-model chat, and more—all in one streamlined platform.
This Docker Compose setup provides an isolated application with Ollama, Open-WebUI, and Nginx reverse proxy to enable secure HTTPS access. Since Open-WebUI does not support SSL natively, Nginx acts as a reverse proxy, handling SSL termination.
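The SSL-termination arrangement described here can be sketched as an Nginx server block. This is a minimal illustration under assumed names: the hostname, certificate paths, and upstream container name/port are placeholders, not taken from the repository:

```nginx
# TLS terminates at Nginx; Open WebUI listens on plain HTTP behind the proxy.
server {
    listen 443 ssl;
    server_name chat.example.com;                        # assumed hostname

    ssl_certificate     /etc/nginx/certs/fullchain.pem; # assumed cert paths
    ssl_certificate_key /etc/nginx/certs/privkey.pem;

    location / {
        proxy_pass http://open-webui:8080;              # assumed upstream container name/port
        proxy_set_header Host $host;
        proxy_set_header X-Forwarded-Proto https;
        # WebSocket upgrade headers, needed for streaming chat responses
        proxy_http_version 1.1;
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection "upgrade";
    }
}
```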
An advanced, multi-backend AI Agent Chat application built with React, FastAPI, and LangChain. Features a rich toolset including web search, code execution, and long-term memory.
Extremely simple chat interface for ollama models.
A web client for Ollama and Llama LLMs.
A simple and easy-to-use Ollama web UI.
AI Chat UI is a responsive, modern chat interface, designed for seamless interactions with local Large Language Models (LLMs) like Ollama. It features a clean design, light/dark themes, PWA support for offline use, real-time chat with streaming responses, and local storage for chat history.
Ollama Pusher simplifies uploading GGUF-based LLMs to the Ollama library: one-time setup, ready to go every time!
Streamlined Ollama WebUI Setup: Automated Scripts, LLM Integration, and Desktop Shortcut Creation
AI model deployment on Synology NAS and macOS 🧠🐳
A simple web interface with Markdown support built using Flask, designed to interact with Ollama models.
A web interface for Ollama, providing a user-friendly way to interact with local language models.
Built using Open WebUI + Ollama + LLaMA 3
A modern, privacy-focused web interface for interacting with Ollama AI models. This application provides a sleek, user-friendly chat interface that runs entirely in your browser while connecting directly to your local Ollama installation.