
Evaluation Metrics - Confusion Matrix

What the confusion matrix shows

A confusion matrix compares:

  • actual labels
  • predicted labels

For binary classification:

  • TP: true positives
  • TN: true negatives
  • FP: false positives
  • FN: false negatives
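The four counts above can be computed directly by comparing label pairs. This is a minimal sketch using hypothetical `y_true` and `y_pred` lists (1 = positive, 0 = negative):

```python
# Hypothetical binary labels (1 = positive, 0 = negative)
y_true = [1, 1, 1, 0, 0, 0, 1, 0]
y_pred = [1, 0, 1, 0, 1, 0, 1, 0]

# Count each cell of the confusion matrix by comparing (actual, predicted) pairs
tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)

print(tp, tn, fp, fn)  # -> 3 3 1 1
```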

```mermaid
flowchart TD
  A[Actual Positive] -->|Pred Positive| TP[TP]
  A -->|Pred Negative| FN[FN]
  B[Actual Negative] -->|Pred Negative| TN[TN]
  B -->|Pred Positive| FP[FP]
```

Why it matters

Every common classification metric (accuracy, precision, recall, F1) is derived from these four counts.
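As a quick sketch, here is how those metrics fall out of the four counts. The numbers are hypothetical, just to make the arithmetic concrete:

```python
# Hypothetical counts taken from a confusion matrix
tp, tn, fp, fn = 3, 3, 1, 1

accuracy  = (tp + tn) / (tp + tn + fp + fn)   # fraction of all predictions that were correct
precision = tp / (tp + fp)                    # of predicted positives, how many were real
recall    = tp / (tp + fn)                    # of real positives, how many were found
f1        = 2 * precision * recall / (precision + recall)  # harmonic mean of the two

print(accuracy, precision, recall, f1)  # -> 0.75 0.75 0.75 0.75
```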

Scikit-learn example

Confusion matrix

```python
from sklearn.metrics import confusion_matrix

cm = confusion_matrix(y_true, y_pred)
print(cm)
```
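The snippet above assumes `y_true` and `y_pred` already exist. A self-contained run might look like this (labels are hypothetical; note that for binary labels scikit-learn orders the matrix rows/columns as 0 then 1, so the layout is `[[TN, FP], [FN, TP]]`):

```python
from sklearn.metrics import confusion_matrix

# Hypothetical labels; in practice these come from your test set and model
y_true = [1, 1, 1, 0, 0, 0, 1, 0]
y_pred = [1, 0, 1, 0, 1, 0, 1, 0]

cm = confusion_matrix(y_true, y_pred)
print(cm)
# Rows are actual classes, columns are predicted classes:
# [[TN FP]
#  [FN TP]]

# Unpack the four counts for use in other metrics
tn, fp, fn, tp = cm.ravel()
print(tn, fp, fn, tp)  # -> 3 1 1 3
```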

Mini-checkpoint

If FN is very high, what does that mean?

  • You’re missing many real positives.
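The damage from a high FN count shows up most directly in recall. A small hypothetical example:

```python
# Hypothetical extreme case: the model misses most of the real positives
tp, fn = 2, 8   # 10 actual positives, only 2 caught

recall = tp / (tp + fn)
print(recall)  # -> 0.2, a high FN count drags recall down
```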

πŸ§ͺ Try It Yourself

Exercise 1 – Train-Test Split

Exercise 2 – Fit a Linear Model

Exercise 3 – Evaluate with MSE
