Clean up llamacpp-chat interface
thomasantony opened this issue
Thomas Antony commented
Make it look as nice as the llama.cpp interface, with terminal colors and everything. The ideal version should be runnable as just `llamacpp-chat -m <path-to-model>`.
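The request above boils down to two pieces: a `-m`/`--model` CLI flag and ANSI-colored terminal output. A minimal sketch of such an entry point is below; the flag names, color choices, and the `main()` structure are illustrative assumptions, not the project's actual interface, and the model-loading/chat-loop step is left as a stub.

```python
# Hypothetical sketch of a llamacpp-chat entry point.
# Assumption: argparse for the -m flag, raw ANSI escapes for colors.
import argparse

# ANSI escape codes for terminal colors (reset restores defaults).
GREEN = "\033[32m"
YELLOW = "\033[33m"
RESET = "\033[0m"


def main(argv=None):
    parser = argparse.ArgumentParser(
        prog="llamacpp-chat",
        description="Interactive chat with a local model.",
    )
    parser.add_argument("-m", "--model", required=True,
                        help="path to the model file")
    args = parser.parse_args(argv)

    print(f"{YELLOW}Loading model: {args.model}{RESET}")
    # ... load the model and run the colored chat loop here ...
    print(f"{GREEN}Ready.{RESET}")
    return args


# Demo invocation with an illustrative model path.
main(["-m", "models/ggml-model.bin"])
```

With a `console_scripts` entry point wired to `main`, this would give exactly the `llamacpp-chat -m <path-to-model>` invocation asked for.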
MagicSource commented
@thomasantony Does chat work now?