An error occurs in llm-rubric if the custom provider's output contains '\n'.
grayhacker91 opened this issue
If the output of a custom Python provider contains '\n', llm-rubric fails with:
Error: Chat Completion prompt is not a valid JSON string: SyntaxError: Unexpected token in JSON at position
Does the output of the Python provider need any additional processing?
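For context, a minimal provider along these lines is enough to reproduce it. This is a sketch assuming promptfoo's custom Python provider interface (a call_api entry point that returns a dict with an "output" key):

```python
# provider.py -- minimal repro sketch, assuming promptfoo's custom
# Python provider interface (call_api entry point, {"output": ...} dict).
def call_api(prompt, options, context):
    # Returning output that contains a literal newline is enough to
    # trigger the "not a valid JSON string" error in llm-rubric.
    return {"output": "first line\nsecond line"}
```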
The problem occurs when the test case has an output property. Solved.