promptfoo / promptfoo

Test your prompts, agents, and RAGs. Redteaming, pentesting, vulnerability scanning for LLMs. Improve your app's quality and catch problems. Compare performance of GPT, Claude, Gemini, Llama, and more. Simple declarative configs with command line and CI/CD integration.

Home Page: https://www.promptfoo.dev/

An error occurs in llm-rubric if there is '\n' in the custom provider's output.

grayhacker91 opened this issue

If the output of a custom Python provider contains '\n', llm-rubric fails with the error below.
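
For reference, a minimal sketch of such a provider. The `call_api(prompt, options, context)` entry point and the `{"output": ...}` return shape follow promptfoo's Python provider interface; the hard-coded answer stands in for a real model call:

```python
# provider.py - minimal sketch of a custom Python provider for promptfoo.
# The hard-coded answer is a stand-in for a real model call.

def call_api(prompt, options, context):
    # The embedded '\n' characters in this multi-line answer are what
    # trigger the llm-rubric failure shown below.
    answer = "First line of the answer.\nSecond line of the answer."
    return {"output": answer}
```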

Error: Chat Completion prompt is not a valid JSON string: SyntaxError: Unexpected token in JSON at position
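
A plausible mechanism (an assumption, not confirmed against the promptfoo source): llm-rubric renders its grading prompt as a JSON string, and raw control characters such as '\n' are illegal inside JSON string literals, so an output interpolated verbatim breaks the parse. A standalone demonstration of that JSON rule:

```python
import json

# A raw newline inside a JSON string literal is invalid JSON;
# the escaped two-character sequence \n is fine.
broken = '{"output": "line one\nline two"}'    # contains a real newline
escaped = '{"output": "line one\\nline two"}'  # contains backslash + n

try:
    json.loads(broken)
except json.JSONDecodeError as e:
    print("broken:", e)  # "Invalid control character at ..."

print("escaped:", json.loads(escaped)["output"])
```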

Does the output of the Python provider need any additional processing?
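
One possible stopgap (a sketch, not an official promptfoo API) is to escape control characters in the provider's output before returning it, for example with a `json.dumps` round trip:

```python
import json

def escape_control_chars(text):
    # json.dumps escapes control characters ('\n' becomes the two
    # characters '\' + 'n'); strip the surrounding quotes it adds.
    return json.dumps(text)[1:-1]

def call_api(prompt, options, context):
    answer = "First line of the answer.\nSecond line of the answer."
    return {"output": escape_control_chars(answer)}
```

Note that this changes what the grader sees (a literal backslash-n rather than a line break), so it is a workaround rather than a fix.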

The problem occurs when the test case has an output property. Solved.