langgenius / dify

Dify is an open-source LLM app development platform. Dify's intuitive interface combines AI workflow, RAG pipeline, agent capabilities, model management, observability features and more, letting you quickly go from prototype to production.

Home Page: https://dify.ai


dify local deployment using docker, can't set OPENAI provider

rackerxu opened this issue · comments

Self Checks

  • This is only for bug reports; if you would like to ask a question, please head to Discussions.
  • I have searched for existing issues, including closed ones.
  • I confirm that I am using English to submit this report (I have read and agree to the Language Policy).
  • Please be sure to submit issues in English, or they will be closed. Thank you! :)
  • Please do not modify this template :) and fill in all the required fields.

Dify version

0.6.12

Cloud or Self Hosted

Self Hosted (Docker)

Steps to reproduce

In the LLM provider settings, I tried to add OpenAI, but after I provided the key and endpoint, it showed "connection error". However, the same key works when I access the OpenAI API with requests from a Jupyter notebook.
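For reference, the notebook check described above can be reproduced with a minimal stdlib-only sketch like the one below (the reporter used the requests library; this version avoids the extra dependency). The endpoint URL and model name are assumptions based on OpenAI's public defaults:

```python
import json
import os
import urllib.request

# Assumption: the standard OpenAI API base URL; adjust if you use a proxy endpoint.
API_BASE = "https://api.openai.com/v1"


def build_request(api_key: str, model: str = "gpt-3.5-turbo") -> urllib.request.Request:
    """Build a minimal chat-completions request to sanity-check an API key."""
    payload = {"model": model, "messages": [{"role": "user", "content": "ping"}]}
    return urllib.request.Request(
        f"{API_BASE}/chat/completions",
        data=json.dumps(payload).encode(),
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
        method="POST",
    )


if __name__ == "__main__":
    key = os.environ.get("OPENAI_API_KEY", "")
    if key:
        # Only sends a real request when a key is present in the environment.
        with urllib.request.urlopen(build_request(key), timeout=10) as resp:
            print(resp.status)
```

If this succeeds on the host but Dify inside Docker still reports "connection error", the problem is almost certainly container networking (no route or proxy from the container to the endpoint), not the key itself.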

✔️ Expected Behavior

OpenAI is added successfully.

❌ Actual Behavior

connection error

Hey @rackerxu! I'm here to assist you with any bugs, questions, or contributions you have for the Dify repository. Let's work together to resolve your issue.

To resolve the "connection error" when adding the OpenAI provider in Dify (version 0.6.12) deployed locally with Docker, first make sure the service URL is reachable from inside the container. If the endpoint you entered points at a server running on the Docker host (for example, a local OpenAI-compatible model server), replace localhost with host.docker.internal in the service URL, for example:

http://host.docker.internal:11434

This change will make the service accessible to the Docker container, resolving the connection error.
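On Linux hosts, host.docker.internal is not defined inside containers by default, so a compose override is commonly used to map it to the host gateway. This is a sketch only; the service name `api` is an assumption and should match the Dify API service in your own docker-compose.yaml:

```yaml
# docker-compose.override.yml (sketch; service name "api" is an assumption)
services:
  api:
    extra_hosts:
      - "host.docker.internal:host-gateway"
```

After adding the override, recreate the containers (e.g. `docker compose up -d`) so the extra host entry takes effect.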

Additionally, if you encounter the error "Validation failed: You exceeded your current quota, please check your plan and billing details" when filling in the OpenAI key, it indicates that the account's quota or balance has been exhausted. Top up the account at openai.com, and refer to OpenAI for details on their plans and billing [1][2].

For integrating LocalAI with Dify, ensure that the Dify container can access the LocalAI endpoint. If Dify is deployed via Docker, use the host IP address instead of localhost for the LocalAI server URL. For example:

http://<your-LocalAI-endpoint-domain>:8080

This can be a LAN IP address, like:

http://192.168.1.100:8080

For more detailed steps on deploying LocalAI and integrating it with Dify, refer to the LocalAI integration guide [3].
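A quick way to confirm the networking side is to test whether the endpoint's TCP port is reachable at all from inside the Dify API container (e.g. via `docker exec`). The helper below is a generic sketch; the LAN IP and port mirror the example above and should be replaced with your own endpoint:

```python
import socket


def can_connect(host: str, port: int, timeout: float = 3.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within the timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False


if __name__ == "__main__":
    # Assumption: LocalAI listening on a LAN address, as in the example above.
    print(can_connect("192.168.1.100", 8080))
```

Run this inside the container rather than on the host: a check that passes on the host but fails in the container points to Docker networking (localhost vs. LAN IP, missing host-gateway mapping) rather than a Dify configuration problem.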


Please configure a proper VPN to resolve this issue.