getzep / zep

Zep: Long-Term Memory for AI Assistants.

Home Page: https://docs.getzep.com

[Issue] getting fatal msg="invalid embeddings deployment for openai, deployment name is required"

axen2u opened this issue

Describe the bug
I'm getting the following error when I try to run docker-compose.yml in a Linux environment; on my local Windows PC it works fine. I'm using local embeddings only. How can I fix this?

zep           | time="2023-12-17T09:58:17Z" level=warning msg=".env file not found or unable to load. This warning can be ignored if Zep is run using docker compose with env_file defined or you are passing ENV variables."
zep           | time="2023-12-17T09:58:17Z" level=info msg="Starting Zep server version 0.14.0-f646081 (2023-10-09T03:42:33+0000)"
zep           | time="2023-12-17T09:58:17Z" level=info msg="Log level set to: info"
zep           | time="2023-12-17T09:58:17Z" level=fatal msg="invalid embeddings deployment for openai, deployment name is required"
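
For reference, this is roughly how the zep service is wired up in my compose file (a sketch from memory; the image tag, volume path, and service names are assumptions and may not match the stock docker-compose.yaml exactly):

version: "3.7"
services:
  zep:
    image: ghcr.io/getzep/zep:latest
    ports:
      - "8000:8000"
    volumes:
      # mounts the config shown under "To Reproduce" into the container
      - ./config.yml:/app/config.yaml
    env_file:
      # ZEP_OPENAI_API_KEY etc. are passed via env_file, so the .env warning above can be ignored
      - .env
    depends_on:
      - db
      - nlp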

To Reproduce

My config.yml looks like this:

llm:
  # openai or anthropic
  service: "openai"
  # OpenAI: gpt-3.5-turbo, gpt-4, gpt-3.5-turbo-1106, gpt-3.5-turbo-16k, gpt-4-32k; Anthropic: claude-instant-1 or claude-2
  model: "gpt-4-32k"
  ## OpenAI-specific settings
  # Only used for Azure OpenAI API
  azure_openai_endpoint: "https://myopenai.openai.azure.com/"
  # for Azure OpenAI API deployment, the model may be deployed with custom deployment names
  # set the deployment names if you encounter in logs HTTP 404 errors:
  #   "The API deployment for this resource does not exist."
  azure_openai:
    # llm.model name is used as deployment name as reasonable default if not set
    # assuming base model is deployed with deployment name matching model name
    # llm_deployment: "gpt-4-32k"
    # embeddings deployment is required when Zep is configured to use OpenAI embeddings
    # embedding_deployment: "text-embedding-ada-002"
  # Use only with an alternate OpenAI-compatible API endpoint
  openai_endpoint:
  openai_org_id:
nlp:
  server_url: "http://localhost:5557"
memory:
  message_window: 5
extractors:
  documents:
    embeddings:
      enabled: false
      chunk_size: 1000
      dimensions: 384
      service: "local"
      # dimensions: 1536
  messages:
    summarizer:
      enabled: true
      entities:
        enabled: true
      embeddings:
        enabled: true
        dimensions: 384
        service: "local"
    entities:
      enabled: true
    intent:
      enabled: true
    embeddings:
      enabled: true
      dimensions: 384
      service: "local"
      # dimensions: 1536
      # service: "openai"
store:
  type: "postgres"
  postgres:
    dsn: "postgres://postgres:postgres@localhost:5432/?sslmode=disable"
server:
  # Specify the host to listen on. Defaults to 0.0.0.0
  host: 0.0.0.0
  port: 8000
  # Is the Web UI enabled?
  # Warning: The Web UI is not secured by authentication and should not be enabled if
  # Zep is exposed to the public internet.
  web_enabled: true
  # The maximum size of a request body, in bytes. Defaults to 5MB.
  max_request_size: 5242880
auth:
  # Set to true to enable authentication
  required: false
  # Do not use this secret in production. The ZEP_AUTH_SECRET environment variable should be
  # set to a cryptographically secure secret. See the Zep docs for details.
  secret: "do-not-use-this-secret-in-production"
data:
  #  PurgeEvery is the period between hard deletes, in minutes.
  #  If set to 0 or undefined, hard deletes will not be performed.
  purge_every: 60
log:
  level: "info"
opentelemetry:
  enabled: false
# Custom Prompts Configuration
# Allows customization of extractor prompts.
custom_prompts:
  summarizer_prompts:
    # Anthropic Guidelines:
    # - Use XML-style tags like <current_summary> as element identifiers.
    # - Include {{.PrevSummary}} and {{.MessagesJoined}} as template variables.
    # - Clearly explain model instructions, e.g., "Review content inside <current_summary></current_summary> tags".
    # - Provide a clear example within the prompt.
    #
    # Example format:
    # anthropic: |
    #   <YOUR INSTRUCTIONS HERE>
    #   <example>
    #     <PROVIDE AN EXAMPLE>
    #   </example>
    #   <current_summary>{{.PrevSummary}}</current_summary>
    #   <new_lines>{{.MessagesJoined}}</new_lines>
    #   Response without preamble.
    #
    # If left empty, the default Anthropic summary prompt from zep/pkg/extractors/prompts.go will be used.
    anthropic: |

    # OpenAI summarizer prompt configuration.
    # Guidelines:
    # - Include {{.PrevSummary}} and {{.MessagesJoined}} as template variables.
    # - Provide a clear example within the prompt.
    #
    # Example format:
    # openai: |
    #   <YOUR INSTRUCTIONS HERE>
    #   Example:
    #     <PROVIDE AN EXAMPLE>
    #   Current summary: {{.PrevSummary}}
    #   New lines of conversation: {{.MessagesJoined}}
    #   New summary:
    #
    # If left empty, the default OpenAI summary prompt from zep/pkg/extractors/prompts.go will be used.
    openai: |
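
Given the comments in the azure_openai block above ("embeddings deployment is required when Zep is configured to use OpenAI embeddings"), my guess is that setting azure_openai_endpoint is what triggers the deployment-name check. If Azure OpenAI is really wanted for the LLM, I'd expect the block to need explicit deployment names, roughly like this (the names below are placeholders and would have to match the deployments created in the Azure resource):

llm:
  service: "openai"
  model: "gpt-4-32k"
  azure_openai_endpoint: "https://myopenai.openai.azure.com/"
  azure_openai:
    # both names must match the deployment names configured in Azure OpenAI
    llm_deployment: "gpt-4-32k"
    embedding_deployment: "text-embedding-ada-002"

Alternatively, since only local embeddings are in use here, leaving azure_openai_endpoint unset (and using a plain OpenAI key) might avoid the check altogether, but I'm not sure whether that is the intended behaviour.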