There are 10 repositories under the localllama topic.
✨ Kubectl plugin to create manifests with LLMs
The easiest way to use the Ollama API in .NET
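For context, libraries like this wrap Ollama's plain HTTP API, which listens on `localhost:11434` by default. A minimal sketch in Python of calling the non-streaming `/api/generate` endpoint — the model name `llama3` is an assumption and must already be pulled locally:

```python
import json
import urllib.request

# Ollama's default local endpoint for text generation
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_generate_request(model: str, prompt: str) -> dict:
    """Build the JSON body for Ollama's /api/generate endpoint."""
    return {"model": model, "prompt": prompt, "stream": False}

def generate(model: str, prompt: str) -> str:
    """Send the request and return the generated text.

    Requires a running Ollama server with the model available.
    """
    body = json.dumps(build_generate_request(model, prompt)).encode("utf-8")
    req = urllib.request.Request(
        OLLAMA_URL, data=body, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

if __name__ == "__main__":
    # "llama3" is an assumed model name; substitute any model you have pulled.
    print(generate("llama3", "Say hello in one word."))
```

Client libraries in other languages send the same JSON body; only the surface API differs.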
[NeurIPS 2024] KVQuant: Towards 10 Million Context Length LLM Inference with KV Cache Quantization
AubAI brings you on-device gen-AI capabilities, including offline text generation and more, directly within your app.
Run GGUF LLM models in the latest version of TextGen-webui
Full featured demo application for OllamaSharp
A tool that allows you to customize and enhance the functionality of local Copilot editor plugins.
A Thunderbird mail client extension that summarizes received emails via a locally run LLM. In early development.
Local AI search assistant (web or CLI) for Ollama and llama.cpp. Lightweight and easy to run, providing a Perplexity-like experience.