
Fix retriever evaluation metrics #547

Merged
merged 3 commits into master from retriever_eval_metrics on Nov 5, 2020
Conversation

bogdankostic
Contributor

Fixes #536

This PR adds the retriever metric Mean Reciprocal Rank and fixes Mean Average Precision in the closed-domain case.
It might make sense to add an open-domain option for finder evaluation in a future PR.
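For reference, the two metrics this PR touches can be sketched in plain Python. This is an illustrative, self-contained version, not Haystack's actual implementation: in the closed-domain setting a retrieved document counts as relevant only if its id is in the gold set for that query, reciprocal rank is 1/rank of the first relevant hit, and average precision is the mean of precision@k over the ranks where relevant documents appear, normalized by the number of relevant documents.

```python
def reciprocal_rank(retrieved_ids, relevant_ids):
    """Return 1/rank of the first relevant document, or 0.0 if none is retrieved."""
    for rank, doc_id in enumerate(retrieved_ids, start=1):
        if doc_id in relevant_ids:
            return 1.0 / rank
    return 0.0


def average_precision(retrieved_ids, relevant_ids):
    """Mean of precision@k over ranks k with a relevant hit, divided by |relevant|."""
    if not relevant_ids:
        return 0.0
    hits = 0
    precision_sum = 0.0
    for rank, doc_id in enumerate(retrieved_ids, start=1):
        if doc_id in relevant_ids:
            hits += 1
            precision_sum += hits / rank
    return precision_sum / len(relevant_ids)


# Averaging over queries yields MRR and mAP:
# e.g. one query where the two relevant docs appear at ranks 2 and 3.
rr = reciprocal_rank(["d3", "d1", "d2"], {"d1", "d2"})   # 1/2
ap = average_precision(["d3", "d1", "d2"], {"d1", "d2"})  # (1/2 + 2/3) / 2
```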

Member

@tholor tholor left a comment


LGTM! As discussed in #536, we can also add mAP for open_domain=True in a separate PR.

@tholor tholor merged commit ffaa024 into master Nov 5, 2020
@julian-risch julian-risch deleted the retriever_eval_metrics branch November 15, 2021 07:08