Hi, thanks for the great work on this package!
I noticed that `confusionMatrix()` returns both Precision and Positive Predictive Value, even though these are the same metric (and should be). The same goes for Recall and Sensitivity (True Positive Rate): they are identical as well, so one of each pair could be dropped from the calculation.

On the other hand, I would like some additional derived metrics that are mentioned here: https://en.wikipedia.org/wiki/Confusion_matrix, such as the False Negative Rate (FNR), the False Positive Rate (FPR), the False Discovery Rate (FDR), and the False Omission Rate (FOR). Could these be easily added? I could create a PR if you like.
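For reference, the four requested rates reduce to simple ratios over the raw confusion-matrix counts (per the definitions on the linked Wikipedia page). A minimal illustrative sketch in Python — the function name and signature are mine, not the package's API:

```python
# Hypothetical helper: derive FNR, FPR, FDR, and FOR from the four
# raw counts of a binary confusion matrix (Wikipedia definitions).
def derived_rates(tp, fp, tn, fn):
    return {
        "FNR": fn / (fn + tp),  # miss rate; equals 1 - Sensitivity (Recall)
        "FPR": fp / (fp + tn),  # fall-out; equals 1 - Specificity
        "FDR": fp / (fp + tp),  # equals 1 - Precision (PPV)
        "FOR": fn / (fn + tn),  # equals 1 - Negative Predictive Value
    }

# Example: 40 true positives, 10 false positives,
# 45 true negatives, 5 false negatives.
rates = derived_rates(tp=40, fp=10, tn=45, fn=5)
print(rates)  # FDR here is 10 / (10 + 40) = 0.2
```

Each rate is the complement of a metric the matrix likely already reports, so they could be computed with no extra inputs.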