There are 15 repositories under the mlx-lm topic.
AI Copilot for Vim/NeoVim
A high-performance API server that exposes OpenAI-compatible endpoints for MLX models. Built with Python and FastAPI, it offers an efficient, scalable, and user-friendly way to run MLX-based vision and language models locally behind an OpenAI-compatible interface.
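Because the server exposes OpenAI-compatible endpoints, any standard OpenAI client should be able to talk to it. A minimal sketch, assuming a server listening locally; the base URL, API key, and model name below are illustrative placeholders, not values taken from the project:

```python
from openai import OpenAI

# Point the official OpenAI client at the local MLX server.
# Base URL, API key, and model name are placeholders; adjust to your setup.
client = OpenAI(base_url="http://localhost:8000/v1", api_key="not-needed")

response = client.chat.completions.create(
    model="mlx-community/Llama-3.2-3B-Instruct-4bit",  # any model the server has loaded
    messages=[{"role": "user", "content": "Summarize MLX in one sentence."}],
)
print(response.choices[0].message.content)
```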
Unified management and routing for llama.cpp, MLX, and vLLM models, with a web dashboard.
Build an Autonomous Web3 AI Trading Agent (BASE + Uniswap V4 example)
Experimental: MLX model provider for Strands Agents - Build, train, and deploy AI agents on Apple Silicon.
An all-in-one LLM chat web UI based on the MLX framework, designed for Apple Silicon.
Various LLM resources and experiments
Federated Fine-Tuning of LLMs on Apple Silicon with Flower.ai and MLX-LM
Add MLX support to Pydantic AI through LM Studio or mlx-lm, and run MLX-compatible Hugging Face models on Apple silicon.
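One common way such an integration works is to point Pydantic AI's OpenAI-compatible model class at a local LM Studio or mlx-lm server. A minimal sketch under that assumption; the class names follow recent pydantic-ai releases and may differ by version, and the URL, API key, and model name are placeholders rather than values from this repository:

```python
from pydantic_ai import Agent
from pydantic_ai.models.openai import OpenAIModel
from pydantic_ai.providers.openai import OpenAIProvider

# Treat the local LM Studio / mlx-lm server as an OpenAI-compatible backend.
# Base URL, API key, and model name are placeholders for your local setup.
model = OpenAIModel(
    "mlx-community/Llama-3.2-3B-Instruct-4bit",
    provider=OpenAIProvider(base_url="http://localhost:1234/v1", api_key="lm-studio"),
)

agent = Agent(model)
result = agent.run_sync("What hardware does MLX target?")
print(result.output)  # older pydantic-ai releases expose this as result.data
```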
Reinforcement learning for text generation on MLX (Apple Silicon)
A comprehensive toolkit for end-to-end continued pre-training, fine-tuning, monitoring, testing and publishing of language models with MLX-LM
📄 Generate text with and fine-tune large language models on Apple silicon using MLX LM, integrating seamlessly with the Hugging Face Hub.
MLX inference service compatible with the OpenAI API, built on MLX-LM and MLX-VLM.
LLM inference on Apple Silicon Macs using Apple's MLX framework.
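The mlx-lm Python API keeps this kind of local inference compact. A minimal sketch using mlx_lm's load and generate helpers; the model name is an example rather than one specified by this repository, and exact keyword arguments vary slightly across mlx-lm releases:

```python
from mlx_lm import load, generate

# Download (or reuse from the local cache) an MLX-converted model from the Hugging Face Hub.
model, tokenizer = load("mlx-community/Llama-3.2-3B-Instruct-4bit")

prompt = "Explain what Apple MLX is in two sentences."

# Run generation on Apple Silicon via MLX; verbose=True prints tokens as they stream.
text = generate(model, tokenizer, prompt=prompt, max_tokens=200, verbose=True)
print(text)
```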
OFX File Creator is a compact Python library/CLI that converts CSV/Excel bank exports into valid OFX statements. It normalizes vendor columns, parses dates and amounts, infers TRNTYPE via configurable YAML/JSON rules (optional mlx-lm enrichment), and includes examples, tests, and GitHub Actions CI.