citadel-ai / langcheck

Simple, Pythonic building blocks to evaluate LLM applications.

Home Page: https://langcheck.readthedocs.io/en/latest/index.html


Add "refusal to answer" metric

kennysong opened this issue

Quite similar to the ai_disclaimer_similarity metric, but for identifying LLM outputs like "I don't have enough information" or "I don't know".
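A minimal sketch of what such a metric could look like, assuming a similarity-based approach along the lines of ai_disclaimer_similarity. The phrase list and the function name `refusal_similarity` are illustrative, not from langcheck, and stdlib `difflib` stands in for the sentence-embedding model a real implementation would likely use:

```python
from difflib import SequenceMatcher

# Illustrative canonical refusal phrases (not an official langcheck list)
REFUSAL_PHRASES = [
    "i don't know",
    "i don't have enough information",
    "i cannot answer that question",
    "i'm not able to help with that",
]

def refusal_similarity(output: str) -> float:
    """Return the highest similarity between the output and any refusal phrase.

    A real metric would embed the output and the reference phrases and take
    cosine similarity; SequenceMatcher is a lightweight stand-in here.
    """
    text = output.lower().strip()
    return max(
        SequenceMatcher(None, text, phrase).ratio()
        for phrase in REFUSAL_PHRASES
    )

# Refusal-like outputs score high, on-topic answers score low:
print(refusal_similarity("I don't know"))
print(refusal_similarity("The capital of France is Paris."))
```

A thresholded version of this score (e.g. flagging outputs above some cutoff) would give a binary "refusal to answer" label, mirroring how similarity metrics are typically consumed.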

@liwii have you started working on this yet? If not, I can take it

Oh, I missed the comment.

Yeah it would be very helpful, thanks!!