kstost / aiexe

aiexe, the cutting-edge command-line interface (CLI/GUI) tool

Home Page: https://youtube.com/@코드깎는노인


Unable to connect to the llama3:70b model on the remote server (e.g., 192.168.0.26)

zzeddo opened this issue · comments

In the `aiFeatures.js` script, the Ollama endpoints are hard-coded to the local host (127.0.0.1), so only models served locally can be fetched:

```js
airesponse = await axiosPostWrap('http://127.0.0.1:11434/api/chat', { model: OLLAMA_MODEL, stream: false, options, messages });
return await axios.get('http://127.0.0.1:11434/api/tags');
```

It would be good if we could specify the IP of the remote server, for example when the model runs on a separate host for security reasons.
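One possible approach (a sketch, not the project's actual implementation) is to read the Ollama host from an environment variable instead of hard-coding 127.0.0.1. The variable name `OLLAMA_HOST` and the helper `getOllamaBaseUrl` below are assumptions for illustration; `OLLAMA_HOST` is chosen because the Ollama server itself uses a variable of that name.

```javascript
// Sketch: resolve the Ollama base URL from the environment, falling back
// to the current local default. OLLAMA_HOST and getOllamaBaseUrl are
// hypothetical names, not part of aiexe today.
const DEFAULT_OLLAMA_HOST = 'http://127.0.0.1:11434';

function getOllamaBaseUrl(env = process.env) {
  const host = env.OLLAMA_HOST || DEFAULT_OLLAMA_HOST;
  // Accept either a bare "192.168.0.26:11434" or a full URL.
  return host.startsWith('http') ? host : `http://${host}`;
}

// The two hard-coded calls in aiFeatures.js could then become:
// await axiosPostWrap(`${getOllamaBaseUrl()}/api/chat`, { model: OLLAMA_MODEL, stream: false, options, messages });
// await axios.get(`${getOllamaBaseUrl()}/api/tags`);

module.exports = { getOllamaBaseUrl };
```

With this in place, pointing aiexe at a remote server would be a matter of setting `OLLAMA_HOST=192.168.0.26:11434` before launching, with no code change needed.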