LLM_GUARD_API not working
knowitall12 opened this issue · comments
Describe the bug
When deploying llm_guard_api, we are unable to use the following endpoints: /docs, /analyze/prompt, /analyze/output
To Reproduce
Steps to reproduce the behavior:
- Deploy llm_guard_api in a Docker container
- Try opening localhost/docs
Expected behavior
The API should return scanner results and serve the Swagger UI.
Hey @knowitall12, please remove `auth` from the config file, or pass an `AUTH_TOKEN` environment variable and then send that token in an `Authorization: Bearer TOKEN` header with each request.
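For anyone hitting the same issue, here is a minimal sketch of calling the API with the token. The host/port (`localhost:8000`) and the exact request payload are assumptions; adjust them to match your deployment and the API's schema:

```python
import os

# Assumption: AUTH_TOKEN is the same value the API was started with.
token = os.environ.get("AUTH_TOKEN", "example-token")

headers = {
    "Authorization": f"Bearer {token}",  # the header the API checks
    "Content-Type": "application/json",
}
payload = {"prompt": "Hello, world"}  # hypothetical request body

# Uncomment to call a running deployment (requires the `requests` package):
# import requests
# resp = requests.post("http://localhost:8000/analyze/prompt",
#                      json=payload, headers=headers)
# print(resp.json())

print(headers["Authorization"])
```

The same header works for `/analyze/output`, and the Swagger UI at `/docs` lets you paste the token via its "Authorize" button.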