AmenRa / ranx

⚡️A Blazing-Fast Python Library for Ranking Evaluation, Comparison, and Fusion 🐍

Home Page: https://amenra.github.io/ranx

[Feature Request] relevance_level parameter

AIIRLab opened this issue

I was wondering whether ranx lets you specify the relevance level, similar to trec_eval's -l parameter. If not, that would be a useful feature for evaluation.

Dear Behrooz,

Thanks again for your interest in ranx! :)

I added the requested feature in v0.3.3, with two different ways to set the relevance level.

Qrels-wise
Qrels now provides the set_relevance_level function: qrels.set_relevance_level(rel_lvl).
It changes the document relevance judgments stored in qrels.
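A minimal sketch of this usage, assuming the standard Qrels/Run dict constructors and evaluate function from ranx (the data below is made up for illustration):

```python
from ranx import Qrels, Run, evaluate

# Graded judgments: with relevance level 2, only documents judged >= 2
# count as relevant.
qrels = Qrels.from_dict({
    "q_1": {"doc_a": 1, "doc_b": 2, "doc_c": 3},
    "q_2": {"doc_a": 0, "doc_d": 2},
})

run = Run.from_dict({
    "q_1": {"doc_a": 0.9, "doc_b": 0.8, "doc_c": 0.7},
    "q_2": {"doc_d": 0.9, "doc_a": 0.5},
})

# Qrels-wise: rewrites the judgments stored in qrels so that documents
# below the given level are treated as non-relevant.
qrels.set_relevance_level(2)

print(evaluate(qrels, run, "map@100"))
```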

Metric-wise
The relevance level can now be appended to metric names: map@100-l2, ndcg-l3.
This way, the original relevance judgments stored in qrels are preserved.
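A corresponding sketch for the metric-wise syntax, again with made-up data and assuming the same Qrels/Run/evaluate API as above:

```python
from ranx import Qrels, Run, evaluate

qrels = Qrels.from_dict({"q_1": {"doc_a": 1, "doc_b": 2, "doc_c": 3}})
run = Run.from_dict({"q_1": {"doc_c": 0.9, "doc_b": 0.8, "doc_a": 0.7}})

# Metric-wise: the relevance level is appended to the metric name,
# so the judgments stored in qrels stay untouched.
print(evaluate(qrels, run, ["map@100-l2", "ndcg-l3"]))
```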

I checked some metric scores against trec_eval after setting the relevance level, and they seem to work as intended.
However, I noticed that setting the relevance level for ndcg has no effect in trec_eval.
I will open an issue on its repo to ask whether that is the expected behavior.

Let me know if you have further feature requests.

Best,

Elias