lobehub / lobe-chat

🤯 Lobe Chat - an open-source, modern-design LLM/AI chat framework. Supports multiple AI providers (OpenAI / Claude 3 / Gemini / Ollama / Bedrock / Azure / Mistral / Perplexity), multi-modal capabilities (Vision/TTS), and a plugin system. One-click FREE deployment of your private ChatGPT chat application.

Home Page: https://chat-preview.lobehub.com


[Bug] Ollama connectivity check fails

anrgct opened this issue · comments

💻 System environment

macOS

📦 Deployment environment

Docker

🌐 Browser

Chrome

🐛 Problem description

(screenshot: 1715781855586)
The Ollama connectivity check fails. I configured LobeChat in Docker to access the host's Ollama service and turned off "Use client-side request mode", but the "connectivity check" is still issued directly from the browser. Chat works normally in this setup, so the failure message is misleading.
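For context, a minimal sketch of the setup described above. The `lobehub/lobe-chat` image and the `OLLAMA_PROXY_URL` variable come from the LobeChat Docker docs; using `host.docker.internal` to reach the host's Ollama is an assumption that holds on Docker Desktop (macOS/Windows):

```shell
# Run LobeChat in Docker, pointing it at an Ollama server on the host.
# host.docker.internal resolves to the host from inside the container on
# Docker Desktop; on Linux, add --add-host=host.docker.internal:host-gateway.
docker run -d --name lobe-chat -p 3210:3210 \
  -e OLLAMA_PROXY_URL=http://host.docker.internal:11434 \
  lobehub/lobe-chat
```

With this configuration, server-side chat requests are proxied through the container, but (per the report) the connectivity check still goes out from the browser.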

🚦 Expected results

No response

📷 Steps to reproduce

No response

📝 Supplementary information

No response

👀 @anrgct

Thank you for raising an issue. We will investigate the matter and get back to you as soon as possible.
Please make sure you have given us as much context as possible.


Starting Ollama requires setting an environment variable:
OLLAMA_ORIGINS=* ollama serve

or

OLLAMA_ORIGINS=* ollama run llama3
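A quick way to check whether the CORS setting took effect is to send a request with an `Origin` header and look for the allow header in the response. This is a sketch: it assumes Ollama's default port 11434, and the origin value is an arbitrary example:

```shell
# With OLLAMA_ORIGINS=* set, Ollama should include an
# Access-Control-Allow-Origin header in its response.
curl -is http://localhost:11434/ \
  -H "Origin: https://chat.example.local" \
  | grep -i "access-control-allow-origin"
```

If the grep prints nothing, the browser will block cross-origin requests from LobeChat and the connectivity check will fail even though Ollama itself is up.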


Accessing http from an https page causes a mixed-content error, and "Use client-side request mode" does not control the "connectivity check". On Windows, in Control Panel → System Properties → Environment Variables → User environment variables, I created the variable "OLLAMA_HOST" with value "0.0.0.0" and the variable "OLLAMA_ORIGINS" with value "*". After that, Immersive Translate connected to Ollama successfully, but the "connectivity check" in lobe-chat always fails, because I serve lobe-chat from a local domain over https. In this state, with "Use client-side request mode" turned off, chat works normally.
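The same Windows configuration can be done from a terminal instead of the Control Panel dialog. A sketch for cmd.exe: `setx` writes user-level environment variables, which only apply to processes started afterwards, so Ollama must be restarted:

```shell
:: Persist user-level environment variables for Ollama (Windows cmd.exe).
:: OLLAMA_HOST=0.0.0.0 makes the server listen on all interfaces;
:: OLLAMA_ORIGINS=* allows browser requests from any origin.
setx OLLAMA_HOST 0.0.0.0
setx OLLAMA_ORIGINS *
```

Note this only addresses CORS; it does not fix the mixed-content block, since the browser still refuses to call an `http://` Ollama endpoint from an `https://` page. Serving Ollama behind an https reverse proxy would be one way around that.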
