Unable to connect to the llama3:70b model on the remote server (e.g., 192.168.0.26)
zzeddo opened this issue
PARK JAEDO commented
In the `aiFeatures.js` script, the Ollama endpoints are hard-coded to the local host (127.0.0.1), so models can only be fetched from the local environment:
airesponse = await axiosPostWrap('http://127.0.0.1:11434/api/chat', { model: OLLAMA_MODEL, stream: false, options, messages });
return await axios.get('http://127.0.0.1:11434/api/tags');
It would be good if we could specify the IP of a remote Ollama server instead, keeping security in mind.
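One way this could be addressed (a sketch, not the project's actual implementation): read the Ollama base URL from an environment variable and fall back to 127.0.0.1. The variable name `OLLAMA_HOST` and the helper `getOllamaBaseUrl` are assumptions for illustration.

```javascript
// Hypothetical sketch: derive the Ollama base URL from an environment
// variable instead of hard-coding 127.0.0.1.
const DEFAULT_OLLAMA_HOST = 'http://127.0.0.1:11434';

function getOllamaBaseUrl() {
  // Accept either "host:port" or a full "http://host:port" value.
  const host = process.env.OLLAMA_HOST;
  if (!host) return DEFAULT_OLLAMA_HOST;
  return host.startsWith('http') ? host : `http://${host}`;
}

// The two hard-coded calls in aiFeatures.js would then become, e.g.:
//   airesponse = await axiosPostWrap(`${getOllamaBaseUrl()}/api/chat`,
//     { model: OLLAMA_MODEL, stream: false, options, messages });
//   return await axios.get(`${getOllamaBaseUrl()}/api/tags`);
```

With this, setting `OLLAMA_HOST=192.168.0.26:11434` would point the app at the remote server while the default behavior stays unchanged.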