
RELEVANCE_ANNOTATIONS_CSV_PATH file for running evaluation #98

Closed
scorpionhiccup opened this issue Dec 12, 2019 · 3 comments

@scorpionhiccup

Hi,

I wanted to run an evaluation using the NDCG score as done in the paper.

Where can I find the RELEVANCE_ANNOTATIONS_CSV_PATH file for the 99 queries mentioned in the README, which is needed to run /src/relevanceeval.py?

I just want to test my results.
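
For context, the evaluation ranks candidate code snippets per query and scores the ranking with NDCG against the human relevance grades. Below is a minimal sketch of that metric for illustration only; the `ndcg` helper and the assumption of 0-3 relevance grades are mine, not the repository's relevanceeval.py implementation:

```python
import numpy as np

def ndcg(ranked_relevances, k=10):
    """NDCG@k for a single query, given relevance grades in ranked order."""
    rels = np.asarray(ranked_relevances, dtype=float)
    if rels.size == 0:
        return 0.0
    # Position discounts 1/log2(rank + 1) for the top-k positions.
    discounts = 1.0 / np.log2(np.arange(2, min(rels.size, k) + 2))
    dcg = float(np.sum(rels[:k] * discounts))
    # Ideal DCG: the same grades sorted from most to least relevant.
    ideal = np.sort(rels)[::-1][:k]
    idcg = float(np.sum(ideal * discounts))
    return dcg / idcg if idcg > 0 else 0.0

# Hypothetical example: grades (0 = irrelevant, 3 = exact match) of the top results for one query.
print(ndcg([3, 2, 0, 1]))
```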

@mallamanis
Contributor

Hi @scorpionhiccup

The file is hidden behind the leaderboard. Once you submit, the relevance evaluation is run. We don't release the annotation file yet, to avoid the case where people overfit on the small test set. You're more than welcome to submit to the leaderboard if you'd like to see the results of your model. We do accept multiple submissions (but no more than one submission every two weeks; see #80). For more fine-grained testing, we recommend that you use the proxy task.

@celsofranssa
Contributor

Hi @mallamanis,

What about publishing the relevance judgements?

@mallamanis
Contributor

Done. See the main README.
