There are 21 repositories under the ollama-client topic.
A snappy, keyboard-centric terminal user interface for interacting with large language models. Chat with ChatGPT, Claude, Llama 3, Phi 3, Mistral, Gemma and more.
LLMX: the easiest third-party local LLM UI for the web!
Simpler than simple: with Ollama, run an LLM on your computer with ease, no GPU required.
A fun project using Ollama, Streamlit & PyShark to chat with PCAP/PCAPNG files locally and privately!
Building a Chain of Thought RAG Model with DSPy, Qdrant and Ollama
An AI toolbox for simplified access to AWS Bedrock and Ollama, written in Rust
🤖 Discord bot for users to create and interact with locally hosted AI chat models. Powered by Ollama.
💬 Discord AI chatbot using Ollama with the new Ollama API
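Several of these projects talk to a locally running Ollama server over its HTTP API. As a point of reference, here is a minimal sketch of what a chat request looks like, assuming Ollama's default endpoint at `http://localhost:11434` and a pulled `llama3` model (both are assumptions, not details from any listed repo):

```python
# Minimal sketch of a chat request against the Ollama HTTP API.
# Assumes an Ollama server on its default port (http://localhost:11434).
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/chat"  # default chat endpoint


def build_chat_payload(model: str, user_message: str) -> dict:
    """Build the JSON body for Ollama's /api/chat endpoint."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": user_message}],
        "stream": False,  # request one JSON response instead of a stream
    }


def send_chat(payload: dict) -> dict:
    """POST the payload to a running Ollama server and return its reply."""
    req = urllib.request.Request(
        OLLAMA_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())


# Build (but do not send) a request; sending requires a live Ollama server.
payload = build_chat_payload("llama3", "Hello!")
```

A bot or GUI from the list above would call `send_chat(payload)` and read the assistant's text from the `message` field of the response.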
Create the prompts you need to write your Novel using AI
CrewAI Local LLM: a locally hosted large language model (LLM) setup for private, offline AI usage and experimentation.
OllamaChat: a user-friendly GUI for interacting with the llama2 and llama2-uncensored AI models. Host them locally with Python and KivyMD. Requires Ollama for Windows. For more, visit Ollama on GitHub.
Ollama Chat is a GUI for Ollama designed for macOS.
A command line utility that queries websites for answers using a local LLM
Desktop UI for Ollama made with PyQt
ollama plugin for asdf version manager
C program for interacting with Ollama server from a Linux terminal
📝 You will be assigned a quest by an LLM
macOS app for interacting with local LLMs, currently Ollama. Embeds a PyInstaller binary into an unsigned macOS app.
This repo demonstrates AI capabilities with Spring Boot.
A DBMS project with a Streamlit frontend for stock-management simulation and backtesting.
Chat with a local LLM about your PDF and text documents, privacy ensured [LlamaIndex and Llama 3]