vovw / kunzite

A local, private GUI for running your inference engine of choice


Kunzite

Kunzite is a lightweight, bloat-free GUI application for local AI chat, powered by the lugia inference engine (built on Apple's MLX framework).

[Screenshot: Kunzite GUI]

Features

  • Local AI: Run AI chat completions entirely on your device
  • MLX Inference Engine: Utilize Apple's MLX framework for fast, efficient inference
  • On-Device Inference: Ensure privacy and offline functionality
  • Bloat-Free GUI: Minimalist interface for distraction-free work

Quick Start

  1. Clone the repository
  2. Install the dependencies: Dear ImGui and lugia
  3. Run make to build the project
  4. Execute ./kunzite to start the application
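The steps above, as a shell transcript. The repository URL is an assumption inferred from the repo slug, and the exact dependency setup may differ on your system:

```shell
# clone the repository (URL assumed from the vovw/kunzite slug)
git clone https://github.com/vovw/kunzite.git
cd kunzite

# Dear ImGui and lugia must be available to the build before this step

# build, then launch the GUI
make
./kunzite
```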

Architecture

  • Kunzite uses a simple client-server architecture
  • GUI: Implemented with Dear ImGui for a lightweight, cross-platform interface
  • LLM Client: Handles communication with the local inference server
  • MLX Server: Runs the language model with the lugia inference engine on Apple's MLX framework

License

This project is licensed under the MIT License.



Languages

C++ (99.8%), Makefile (0.2%), Shell (0.1%), C (0.0%)