wuxmax's starred repositories
open-interpreter
A natural language interface for computers
ColossalAI
Making large AI models cheaper, faster and more accessible
open-webui
User-friendly WebUI for LLMs (Formerly Ollama WebUI)
instructor
Structured outputs for LLMs
inference
Replace OpenAI GPT with another LLM in your app by changing a single line of code. Xinference gives you the freedom to run inference with any open-source language, speech recognition, or multimodal model, whether in the cloud, on-premises, or on your laptop.
guardrails
Adding guardrails to large language models.
ollama_proxy_server
A proxy server for multiple Ollama instances with key-based security
lm-inference-engines
Comparison of Language Model Inference Engines