psugihara / FreeChat

llama.cpp-based AI chat app for macOS

Home Page: https://www.freechat.run


Where is the server

hassanzadeh opened this issue

Hey guys,
Quick question: how is the server executable created? Is it built by the llama.cpp project?

Yes, the server is llama.cpp/examples/server running on localhost.
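For reference, here's a rough sketch of what that looks like once the server binary is built (the model path, port, and context size are just illustrative, not what FreeChat actually uses):

    # start the llama.cpp example server on localhost (paths and port illustrative)
    ./server -m ./models/your-model.gguf --host 127.0.0.1 --port 8080 -c 2048

    # then query it from another terminal via the server's /completion endpoint
    curl http://127.0.0.1:8080/completion \
      -H "Content-Type: application/json" \
      -d '{"prompt": "Hello", "n_predict": 64}'

FreeChat bundles and launches that server process itself; the commands above just show roughly what it's doing under the hood.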

I make a universal binary by compiling it on an ARM machine and an Intel machine, then combining the two with:
lipo -create server server_x86 -output freechat-server
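To sanity-check the combined binary, lipo can also report which architecture slices it contains:

    # should list both x86_64 and arm64
    lipo -info freechat-server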

Interesting. When you say compile, do you mean running "make -j [arch-related args]" in llama.cpp?

Yes, but just "make".
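So the whole flow is roughly this (assuming a llama.cpp checkout on each machine; the server_x86 name just mirrors the lipo command above):

    # on the Apple Silicon machine
    make                # produces ./server (arm64), among the other example binaries

    # on the Intel machine, run the same command and copy the result over as server_x86
    make                # produces ./server (x86_64)

    # then combine the two into a single universal binary
    lipo -create server server_x86 -output freechat-server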

Got it, thanks.
This is really interesting work, congrats!