
A3 - same X shape but different values for confusion matrix #9

Closed
thirumalrao opened this issue Mar 18, 2017 · 2 comments
@thirumalrao

I get the same shapes for the training and test data, but my confusion matrix deviates slightly from the one given in Log.txt. Has anyone else faced this? I wonder where and what could have gone wrong. Any pointers would be appreciated.

training data shape: (27867, 18290)

testing data shape: (28033, 18290)

confusion matrix:
Predicted  I-LOC I-MISC  I-ORG  I-PER      O
Actual
I-LOC        861     13     58    131     87
I-MISC        54    334     43     40     98
I-ORG        155     21    405    261    173
I-PER         66     10     42   1304    134
O             58     15     33     97  23540

evaluation matrix:
              I-LOC    I-MISC     I-ORG     I-PER         O
precision  0.721106  0.849873  0.697074  0.711402  0.979527
recall     0.748696  0.586995  0.399015  0.838046  0.991450
f1         0.734642  0.694387  0.507519  0.769549  0.985453
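
For reference, the values above follow the usual per-class definitions: precision is the diagonal entry of the confusion matrix divided by its column sum, recall is the diagonal entry divided by its row sum, and F1 is their harmonic mean. A minimal sketch of that step, assuming the confusion matrix is a pandas DataFrame with actual tags as rows and predicted tags as columns (the name evaluate here is only illustrative, not necessarily the assignment's actual signature):

```python
import numpy as np
import pandas as pd

def evaluate(conf):
    """Per-class precision/recall/F1 from a confusion-matrix DataFrame
    with actual tags as rows and predicted tags as columns."""
    correct = pd.Series(np.diag(conf.values), index=conf.index)
    precision = correct / conf.sum(axis=0)   # correct / everything predicted as that tag
    recall = correct / conf.sum(axis=1)      # correct / everything actually that tag
    f1 = 2 * precision * recall / (precision + recall)
    return pd.DataFrame([precision, recall, f1],
                        index=["precision", "recall", "f1"])
```

Recomputing the evaluation matrix from the posted confusion matrix with these formulas is a quick way to check whether a deviation comes from the matrix itself or from the metric step.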

@LarryZhao0616

I think there must be some small mistake in your confusion() and evaluate() functions.

If you don't want to post your code publicly, you could e-mail it to me at szhao31 AT hawk.iit.edu.

Regards
Sihan
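
For what it's worth, a confusion matrix in the layout posted above (an Actual row axis and a Predicted column axis) is what pandas.crosstab produces from two tag sequences. A minimal sketch of confusion() under that assumption, with an illustrative signature:

```python
import pandas as pd

def confusion(y_true, y_pred):
    """Confusion matrix as a DataFrame: rows = actual tags, columns = predicted tags."""
    return pd.crosstab(pd.Series(y_true, name="Actual"),
                       pd.Series(y_pred, name="Predicted"))
```

Combined with the evaluate() sketch above, confusion(test_tags, test_preds) should reproduce the same table layout, provided the two tag sequences are aligned element by element.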

@thirumalrao
Author

I tried multiple implementations of confusion(). All of them give me the same values as posted above.
