richardanaya / htmx_llamacpp_server

A fun little project that builds a chat interface for a llama.cpp server LLM using HTMX and Rust


HTMX + Llama.cpp Server ❤️

On the machine with llama.cpp installed, start the server. Here `-ngl 100` offloads up to 100 model layers to the GPU, `--port 9090` sets the listening port, and `--host 0.0.0.0` makes the server reachable from other machines:

.\llama-server -ngl 100 --port 9090 -m <some.gguf> --host 0.0.0.0
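Before starting the chat interface, you can optionally confirm the llama.cpp server is reachable; recent llama.cpp server builds expose a `/health` endpoint (replace the placeholder IP with your server's address):

```shell
# Sanity check: should return a JSON status from the llama.cpp server.
curl http://<llama.cpp_server_IP>:9090/health
```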

Then run this project, pointing it at the llama.cpp server:

cargo run -- --llama http://<llama.cpp_server_IP>:9090
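As a rough sketch of what the `--llama` flag does (this is an illustration, not the project's actual code): the Rust binary reads the flag's value as the base URL of the llama.cpp server and forwards chat requests there. The function name `llama_url_from` and the default fallback URL are assumptions for this example:

```rust
use std::env;

// Hypothetical sketch: extract the value following `--llama` from the
// command-line arguments; this mirrors the README's usage, not the
// project's real argument parser.
fn llama_url_from(args: &[String]) -> Option<String> {
    let mut it = args.iter();
    while let Some(arg) = it.next() {
        if arg == "--llama" {
            // The next argument is the llama.cpp server's base URL.
            return it.next().cloned();
        }
    }
    None
}

fn main() {
    let args: Vec<String> = env::args().collect();
    // Assumed fallback: a local llama.cpp server on port 9090.
    let url = llama_url_from(&args)
        .unwrap_or_else(|| "http://127.0.0.1:9090".to_string());
    println!("Proxying chat requests to llama.cpp at {url}");
}
```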
[Screenshot of the chat interface, 2024-06-30]

About


License: MIT


Languages

- CSS 46.9%
- Rust 34.6%
- Jinja 11.4%
- HTML 7.1%