Correct treatment of the unseen labels in the training set #290
Conversation
def test_all_new_labels_in_test():
    """
    Test classification with all labes in test set unseen
"labes" -> "labels"
Looks pretty good. Just a trivial typo.
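The test above exercises the case where every label in the test set is unseen during training. As a minimal sketch of that scenario (not SKLL's actual test code; the data and classifier here are hypothetical), one can train on one set of labels, evaluate on entirely new ones, and keep the confusion matrix well defined by passing the union of labels explicitly:

```python
# Hypothetical sketch, not from this PR: evaluation should not break when
# every label in the test set is unseen at training time.
from sklearn.svm import SVC
from sklearn.metrics import confusion_matrix

X_train = [[0.0], [0.1], [1.0], [1.1]]
y_train = [1, 1, 2, 2]           # training labels: 1 and 2
X_test = [[0.05], [1.05]]
y_test = [3, 4]                  # every test label is unseen in training

clf = SVC().fit(X_train, y_train)
y_pred = clf.predict(X_test)     # predictions can only be 1 or 2

# Passing the union of train and test labels keeps the matrix square
# over all labels, with all-zero columns/rows for labels never predicted.
all_labels = sorted(set(y_train) | set(y_test))
cm = confusion_matrix(y_test, y_pred, labels=all_labels)
print(cm.shape)                  # (4, 4)
```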
model_params,
 grid_score) = res
# check that the new metric is included into the results |
do you mean label instead of metric?
@aloukina you need to merge in
Is this ready, @aloukina?
See my comment under #271: we need to make sure that the labels in the confusion matrix are assigned correctly when the new labels fall between the old labels. Currently my branch generates a confusion matrix with the new labels at the end of the matrix. I have not yet looked at that part of the code to see whether we need to fix this here or there.
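The ordering issue described above can be illustrated with a short sketch (assumed data, not from the PR): scikit-learn's `confusion_matrix` lays out rows and columns in whatever order the `labels` list gives, so appending unseen labels at the end produces a different matrix than sorting them into place:

```python
# Sketch: how label order changes the confusion matrix layout when a label
# unseen in training (2) falls between the training labels (1 and 3).
from sklearn.metrics import confusion_matrix

y_true = [1, 2, 3, 2]            # label 2 never appeared at training time
y_pred = [1, 1, 3, 3]            # a trained model can only predict 1 or 3

# New labels appended at the end (the behavior described above):
appended = confusion_matrix(y_true, y_pred, labels=[1, 3, 2])

# All labels sorted, so 2 sits between 1 and 3:
in_order = confusion_matrix(y_true, y_pred, labels=[1, 2, 3])

print(appended)
print(in_order)
```

Both matrices contain the same counts; only the row/column positions of label 2 differ, which is why the two PRs need to agree on a single ordering.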
I think that should be fixed in #271 after merging this one. Your PR by itself is ready, right?
yes
@aoifecahill since you filed this issue, is there some kind of other data you can test on to make sure this is working as expected?
Yep, will do.
Works on my dataset 👍
…low-unseen-labels-in-the-test-set Correct treatment of the unseen labels in the training set
No description provided.