ad-si / cai

The fastest CLI tool for prompting LLMs, including support for prompting several LLMs at once!

Local provider URL not configurable / doesn't use standard Ollama URL/port

sammcj opened this issue

commented

It seems odd that cai tries to use Ollama on a non-standard port (8080).

I think it should use, or at least default to, the standard Ollama URL of http://127.0.0.1:11434 - this would allow people to install cai and just start using it without having to customise Ollama or other compatible tooling that relies on Ollama's default port 11434.

cai local test
ERROR:
error sending request for url (http://localhost:8080/v1/chat/completions)

ollama serve
listen tcp 127.0.0.1:11434
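
For reference, Ollama serves its OpenAI-compatible chat endpoint on that same default port (11434), so a plain request against it should work with no reconfiguration - the model name below is just an example of one that's already pulled locally:

curl http://localhost:11434/v1/chat/completions \
  -H 'Content-Type: application/json' \
  -d '{"model": "llama2", "messages": [{"role": "user", "content": "Say hi"}]}'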
commented

PR submitted #5

Thanks for the heads up and the PR!
I implemented it a little differently, but I hope you like it? -> a3aac65

TLDR:

cargo install cai
cai ollama llama2 How high is the Eiffel Tower in meters
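
(Assuming the llama2 model is already available locally - if not, fetch it first with the standard Ollama command:)

ollama pull llama2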
commented

That works! Thanks a bunch :)

$ cai ollama tinydolphin:1.1b-v2.8-q5_K_M 'tell me a joke'

🧠 Model: Ollama's tinydolphin:1.1b-v2.8-q5_K_M
⏱️ Duration: 706 ms


 Why is there a hole in the ground? Because if you drop a penny and it goes
through, it'll take us to the moon.