ollama / ollama

Get up and running with Llama 3.2, Mistral, Gemma 2, and other large language models.

Home Page: https://ollama.com

use"docker logs -f ragflow-server"

liukx362330 opened this issue

What is the issue?

When I use "docker logs -f ragflow-server", the project needs network access; otherwise it reports:
"requests.exceptions.ConnectTimeout: HTTPSConnectionPool(host='openaipublic.blob.core.windows.net', port=443):
Max retries exceeded with url: /encodings/cl100k_base.tiktoken (Caused by ConnectTimeoutError(<urllib3.connection.HTTPSConnection object at 0x7f4dd75afac0>,
'Connection to openaipublic.blob.core.windows.net timed out. (connect timeout=None)'))"
I want to use the project without network access. Can you help me? Thank you!
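
For reference, the URL in the traceback is the encoding file that the tiktoken library downloads on first use, so one possible offline workaround (an assumption, not something ragflow documents in this issue) is to pre-populate tiktoken's cache on a machine that does have network access and point the container at it. The sketch below relies on tiktoken's cache mechanism, which keys files by the SHA-1 of the source URL and honors the TIKTOKEN_CACHE_DIR environment variable; the directory name is arbitrary.

```python
# Sketch of an offline workaround, assuming the timeout comes from tiktoken
# fetching cl100k_base.tiktoken. Run this on a machine WITH network access,
# then copy the cache directory into the offline host/container.
import hashlib
import os
import urllib.request

url = "https://openaipublic.blob.core.windows.net/encodings/cl100k_base.tiktoken"
cache_dir = "./tiktoken_cache"  # any writable directory you choose
os.makedirs(cache_dir, exist_ok=True)

# tiktoken names cached files after the SHA-1 hex digest of the source URL
cache_key = hashlib.sha1(url.encode()).hexdigest()
urllib.request.urlretrieve(url, os.path.join(cache_dir, cache_key))

# On the offline machine, set TIKTOKEN_CACHE_DIR=/path/to/tiktoken_cache
# (e.g. via `environment:` in docker-compose) before starting ragflow-server,
# so tiktoken reads the local copy instead of contacting
# openaipublic.blob.core.windows.net.
```
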

OS

No response

GPU

No response

CPU

No response

Ollama version

No response