psugihara / FreeChat

llama.cpp based AI chat app for macOS

Home Page: https://www.freechat.run

Chat locally with REST API

greenido opened this issue · comments

It would be great to be able to chat with or query this via a REST API so we can bind it into other projects.

Absolutely, thanks for making an issue. Do you mean the REST API from llama.cpp/examples/server, or something else? I made #28 because a few people wanted that.

Ahh... yes please!
Something like the server example would be awesome.

Cool! Also, maybe I misunderstood your original question. Right now you can hit llama.cpp/examples/server running on localhost:8690 while you're chatting with FreeChat (at some point I want to bind that port dynamically, because FreeChat will break if it's already taken). The server includes a tiny HTML front-end as well as a streaming completion API: https://github.com/ggerganov/llama.cpp/tree/master/examples/server
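For anyone wanting to bind this into another project, here's a minimal sketch of talking to that local server from Python. It assumes FreeChat is running with the llama.cpp server on port 8690 as described above; the `/completion` endpoint and payload fields come from the llama.cpp server example's docs, and `build_request`/`parse_sse_line` are just hypothetical helper names:

```python
import json
import urllib.request

def build_request(prompt, n_predict=128, stream=True):
    """Build a POST to the llama.cpp server's /completion endpoint
    on FreeChat's default port (8690)."""
    payload = {"prompt": prompt, "n_predict": n_predict, "stream": stream}
    return urllib.request.Request(
        "http://localhost:8690/completion",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )

def parse_sse_line(line):
    """Streamed replies arrive as server-sent-event lines of the form
    `data: {...}`, each JSON chunk carrying a "content" fragment and a
    "stop" flag; return the decoded chunk, or None for non-data lines."""
    line = line.strip()
    if line.startswith("data: "):
        return json.loads(line[len("data: "):])
    return None

# Example of decoding one streamed chunk (no server needed):
chunk = parse_sse_line('data: {"content": "Hello", "stop": false}')
```

To actually stream a reply you'd pass the request to `urllib.request.urlopen` and feed each line of the response through `parse_sse_line`, concatenating the `"content"` fragments until a chunk reports `"stop": true`.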

Architecturally, FreeChat is basically a one-click runner and alternate front-end for server.cpp.