
How did you evaluate on TREC 2019 test #22

Closed
jordane95 opened this issue Jun 8, 2022 · 1 comment
@jordane95

Hi,

I can't find instructions for replicating the nDCG performance on TREC 19.
Could you tell me how to run the evaluation on the TREC 19 test set?

Thanks.

@jingtaozhan
Owner

inference.py takes a 'mode' argument. Set it to 'test' to get the results on TREC 19.
To evaluate the nDCG metric, you need to use the trec_eval tool.
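A minimal sketch of the two steps above. The exact flag name for the mode argument and the file names (run file, qrels file) are assumptions and depend on this repository's setup; trec_eval itself and its `-m ndcg_cut.10` measure flag are standard.

```shell
# Step 1: run inference in test mode to produce a ranking for TREC 19.
# (Flag spelling is an assumption; check inference.py's argparse setup.)
python inference.py --mode test

# Step 2: score the TREC-format run file with trec_eval.
# nDCG@10 is the metric commonly reported for the TREC 2019 Deep Learning track.
# 2019qrels-pass.txt is the passage-task qrels file distributed by NIST.
trec_eval -m ndcg_cut.10 2019qrels-pass.txt run.trec
```

trec_eval expects the run file in the standard six-column TREC format (qid, Q0, docid, rank, score, run_tag), so make sure inference.py's output is converted to that layout before scoring.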
