evaluate crashes if there are labels in test that don't appear in train #279

Closed
aoifecahill opened this issue Feb 5, 2016 · 1 comment
@aoifecahill (Collaborator)

Since the set of labels in the test set is fixed, I think it should be possible, when computing the scores and generating the confusion matrix, to add any labels that appear only in the test set to the `label_dict` instead of crashing.
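
A minimal sketch of that idea, assuming `label_dict` maps label names to integer indices as in SKLL; the variable names (`train_labels`, `test_labels`, `y_true`, `y_pred`) are hypothetical, and `sklearn.metrics.confusion_matrix` is called directly with an explicit `labels` argument so that test-only labels keep the matrix well-defined:

```python
from sklearn.metrics import confusion_matrix

# Hypothetical labels seen at training time (e.g. from the learner's label_dict).
train_labels = ["cat", "dog"]
# Gold labels in the test set may include classes never seen in training.
test_labels = ["cat", "dog", "fish"]

# Extend the label mapping with any test-only labels instead of crashing.
label_dict = {label: i for i, label in enumerate(train_labels)}
for label in sorted(set(test_labels) - set(label_dict)):
    label_dict[label] = len(label_dict)

# Full label list, ordered by index, covering both train and test labels.
all_labels = sorted(label_dict, key=label_dict.get)

y_true = ["cat", "fish", "dog"]  # gold labels, including the unseen "fish"
y_pred = ["cat", "dog", "dog"]   # the model can only predict labels it saw in training

# Passing the full label list keeps the matrix square and well-defined: the
# row for "fish" is all zeros except where the model mispredicted it.
cm = confusion_matrix(y_true, y_pred, labels=all_labels)
print(all_labels)
print(cm)
```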

@desilinguist added this to the 1.2 milestone Feb 8, 2016
@desilinguist (Member)

Addressed by #290.
