protectai / llm-guard

The Security Toolkit for LLM Interactions

Home Page: https://llm-guard.com/

LLM_GUARD_API not working

knowitall12 opened this issue · comments

Describe the bug
When deploying llm_guard_api, we are unable to use the following endpoints: /docs, /analyze/prompt, /analyze/output.

To Reproduce
Steps to reproduce the behavior:

  1. Deploy llm_guard_api in a Docker container
  2. Try opening localhost/docs (see the sketch after these steps)
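
A minimal reproduction sketch in Python, assuming the container is exposed on port 8000 (the port, host, and use of the requests library are assumptions; the issue does not specify how the container is published):

```python
import requests

# Attempt to load the Swagger UI served by llm_guard_api.
# Port 8000 is an assumption; adjust to whatever port the container maps.
response = requests.get("http://localhost:8000/docs", timeout=10)
print(response.status_code)  # the reporter sees a failure here instead of 200
```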

Expected behavior
The API should return the scanner results and serve the Swagger UI.

Hey @knowitall12, please remove the auth section from the config file, or pass the AUTH_TOKEN environment variable, which you can then send in the Authorization: Bearer TOKEN header.
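
For reference, a hedged sketch of the second option, assuming the API is reachable on localhost:8000 and that /analyze/prompt accepts a JSON body with a prompt field (the endpoint name comes from the issue; the port and payload shape are assumptions):

```python
import os
import requests

# AUTH_TOKEN must match the token configured for the llm_guard_api container.
token = os.environ["AUTH_TOKEN"]

response = requests.post(
    "http://localhost:8000/analyze/prompt",        # port is an assumption
    headers={"Authorization": f"Bearer {token}"},  # token passed as Bearer header
    json={"prompt": "Summarize this document."},   # payload shape is an assumption
    timeout=30,
)
print(response.status_code, response.json())
```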