defenseunicorns / leapfrogai

Production-ready Generative AI for local, cloud native, airgap, and edge deployments.

Home Page: https://leapfrog.ai


Determine why vLLM will not start up with decorators; replace the non-decorator deployment

CollectiveUnicorn opened this issue

Describe what should be investigated or refactored

While updating vLLM to use confz, an issue was identified where vLLM would not start up if the leapfrogai-api decorators (such as @llm) were used.

Once the root cause is identified, replace the current non-decorator implementation with the decorators.
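For readers unfamiliar with the two styles being compared, the sketch below illustrates the general difference between decorator-based registration and explicit (non-decorator) registration. This is a hypothetical toy example, not the actual leapfrogai-api `@llm` implementation: the `llm` decorator, `MODEL_REGISTRY`, and backend classes here are all invented for illustration.

```python
from typing import Dict, Type

# Toy registry of model backends (hypothetical; not the real leapfrogai-api).
MODEL_REGISTRY: Dict[str, Type] = {}


def llm(cls: Type) -> Type:
    """Stand-in for a decorator like @llm: registers the class at import time."""
    MODEL_REGISTRY[cls.__name__] = cls
    return cls


# Decorator style: registration happens as a side effect of importing the
# module, so it runs before any startup/configuration code executes. This is
# the kind of ordering that can behave differently across environments.
@llm
class DecoratedBackend:
    def generate(self, prompt: str) -> str:
        return f"completion for: {prompt}"


# Non-decorator style: the same registration done explicitly during startup,
# after configuration has been loaded.
class ExplicitBackend:
    def generate(self, prompt: str) -> str:
        return f"completion for: {prompt}"


MODEL_REGISTRY[ExplicitBackend.__name__] = ExplicitBackend
```

The functional outcome is the same registry either way; the difference is *when* registration runs (import time vs. explicit startup), which is one plausible axis along which a Kubernetes deployment could diverge from Docker or local runs.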

Links to any relevant code

#331

Additional context

This only occurs in a Kubernetes deployment; the issue does not occur when using Docker or running locally.