confusionMatrix: Precision == Positive Predictive Value and Recall == Sensitivity #1115

Open
msberends opened this issue Feb 5, 2020 · 1 comment

msberends (Contributor) commented Feb 5, 2020

Hi, thanks for the great work on this package!

I noticed that confusionMatrix() returns both the Precision and the Positive Predictive Value, although these are actually the same metric (and should be). The same goes for the Recall and the Sensitivity (True Positive Rate); those are identical as well. So I guess one value of each pair could be removed from the calculation.
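
A minimal sketch of the underlying definitions (plain R with made-up counts, not caret's internal code), showing why the pairs coincide:

```r
## Toy counts for a 2x2 confusion matrix (illustrative values only)
TP <- 40; FP <- 10; FN <- 5; TN <- 45

precision <- TP / (TP + FP)   # by definition the Positive Predictive Value
recall    <- TP / (TP + FN)   # by definition the Sensitivity / True Positive Rate

c(Precision = precision, Recall = recall)
```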

On the other hand, I would like to see some of the additional derivations mentioned here: https://en.wikipedia.org/wiki/Confusion_matrix, such as the False Negative Rate (FNR), the False Positive Rate (FPR), the False Discovery Rate (FDR) and the False Omission Rate (FOR). Could those be easily added? I could create a PR if you like.
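
As a plain-R sketch of those four rates (same made-up counts as above; these are just the definitions from that Wikipedia page, not an existing confusionMatrix() output):

```r
## Requested derivations from a 2x2 confusion matrix
TP <- 40; FP <- 10; FN <- 5; TN <- 45

FNR <- FN / (FN + TP)   # False Negative Rate  = 1 - Sensitivity
FPR <- FP / (FP + TN)   # False Positive Rate  = 1 - Specificity
FDR <- FP / (FP + TP)   # False Discovery Rate = 1 - Precision (PPV)
FOR <- FN / (FN + TN)   # False Omission Rate  = 1 - Negative Predictive Value

c(FNR = FNR, FPR = FPR, FDR = FDR, FOR = FOR)
```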

@Hu-statistics

Are the values of the sensitivity and the specificity in the output of confusionMatrix() reversed?
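
One quick way to check this on a toy example (a sketch; the positive class is passed explicitly via the `positive` argument so there is no ambiguity about which factor level counts as "positive"):

```r
library(caret)

## All true "pos" cases are predicted correctly and one "neg" case is not,
## so Sensitivity should come out as 1 and Specificity as 2/3.
truth <- factor(c("pos", "pos", "pos", "neg", "neg", "neg"), levels = c("pos", "neg"))
pred  <- factor(c("pos", "pos", "pos", "neg", "pos", "neg"), levels = c("pos", "neg"))

cm <- confusionMatrix(pred, truth, positive = "pos")
cm$byClass[c("Sensitivity", "Specificity")]
```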
