
Is there any plan to release the labels for the test dataset? #3

Closed
YanLiang1102 opened this issue Dec 20, 2020 · 2 comments

Comments

@YanLiang1102

Or the plan is to always submit to the leaderboard?

@Bakser
Member

Bakser commented Apr 22, 2021

We do not plan to release the test labels, in order to avoid overfitting on the test set. You can tune hyperparameters on the released dev set and submit your predictions to the CodaLab leaderboard.

@Bakser Bakser closed this as completed Apr 22, 2021
@zero0kiriyu

Hi, I am currently trying to implement an incremental learning algorithm on this dataset. However, without labels it is impossible to evaluate performance on a specific subset of the test set. It would be very helpful if you could release the test set labels.


3 participants