## 4.3 Confusion table

Slides

Notes

The confusion table is a way of measuring the different types of errors and correct decisions that a binary classifier can make. With this information, the quality of the model can be evaluated using different metrics.

When a logistic regression model makes a prediction, each prediction falls into one of four categories (a counting sketch follows this list):

* The prediction is that the customer WILL churn. This is known as the Positive class.
    * The customer actually churned: a True Positive (TP)
    * The customer did not churn: a False Positive (FP)
* The prediction is that the customer WILL NOT churn. This is known as the Negative class.
    * The customer did not churn: a True Negative (TN)
    * The customer actually churned: a False Negative (FN)
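
As a rough sketch of how these four counts could be computed, assuming NumPy arrays named `y_val` (actual labels) and `y_pred` (predicted churn probabilities) with a 0.5 decision threshold; the names, toy values, and threshold are illustrative assumptions, not the course's exact code:

```python
import numpy as np

# Toy example (illustrative values): actual labels and predicted churn probabilities
y_val = np.array([1, 0, 1, 0, 0, 1, 0, 1])
y_pred = np.array([0.8, 0.3, 0.4, 0.6, 0.1, 0.9, 0.2, 0.7])

# Turn probabilities into hard decisions: positive class = customer WILL churn
predict_churn = y_pred >= 0.5

# Count the four outcomes with boolean masks
tp = (predict_churn & (y_val == 1)).sum()   # predicted churn, customer churned
fp = (predict_churn & (y_val == 0)).sum()   # predicted churn, customer stayed
tn = (~predict_churn & (y_val == 0)).sum()  # predicted no churn, customer stayed
fn = (~predict_churn & (y_val == 1)).sum()  # predicted no churn, customer churned

print(tp, fp, tn, fn)  # 3 1 3 1 for these toy arrays
```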

The confusion table summarizes these four outcomes in a tabular format, as shown below:

|                 | Predicted Negative | Predicted Positive |
|-----------------|--------------------|--------------------|
| Actual Negative | TN                 | FP                 |
| Actual Positive | FN                 | TP                 |

![confusion matrix](confusion_matrix.png)

The accuracy is the proportion of correct predictions: the sum of TN and TP divided by the total number of observations, i.e. (TN + TP) / (TN + TP + FN + FP).
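
As a minimal sketch of building the table and reading the accuracy off it, scikit-learn's `confusion_matrix` uses the same layout (actual classes on rows, predicted classes on columns); the toy arrays repeat the illustrative values from the sketch above:

```python
import numpy as np
from sklearn.metrics import confusion_matrix

# Same toy data as the counting sketch above (illustrative values)
y_val = np.array([1, 0, 1, 0, 0, 1, 0, 1])
predict_churn = np.array([0.8, 0.3, 0.4, 0.6, 0.1, 0.9, 0.2, 0.7]) >= 0.5

# Rows are actual classes, columns are predicted classes
table = confusion_matrix(y_val, predict_churn.astype(int))
tn, fp, fn, tp = table.ravel()  # standard unpacking for a binary 0/1 problem

accuracy = (tn + tp) / table.sum()
print(table)     # [[3 1]
                 #  [1 3]]
print(accuracy)  # 0.75
```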

The code for this project is available in this Jupyter notebook.

Add notes from the video (PRs are welcome)

⚠️ The notes are written by the community.
If you see an error here, please create a PR with a fix.
