Evaluation Metrics - Confusion Matrix
What the confusion matrix shows
A confusion matrix compares:
- actual labels
- predicted labels
For binary classification:
- TP: true positives
- TN: true negatives
- FP: false positives
- FN: false negatives
flowchart TD
    A[Actual Positive] -->|Pred Positive| TP[TP]
    A -->|Pred Negative| FN[FN]
    B[Actual Negative] -->|Pred Negative| TN[TN]
    B -->|Pred Positive| FP[FP]
Why it matters
All important classification metrics come from these four numbers.
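As a sketch of that idea, the four counts can be pulled out of a scikit-learn confusion matrix with `ravel()` and combined into the standard metrics. The labels below are made up for illustration:

```python
from sklearn.metrics import confusion_matrix

# Hypothetical example labels (1 = positive, 0 = negative).
y_true = [1, 1, 1, 0, 0, 0, 0, 1]
y_pred = [1, 0, 1, 0, 0, 1, 0, 1]

# For binary labels, ravel() flattens the 2x2 matrix as (TN, FP, FN, TP).
tn, fp, fn, tp = confusion_matrix(y_true, y_pred).ravel()

# Every common classification metric is a ratio of these four counts.
accuracy = (tp + tn) / (tp + tn + fp + fn)
precision = tp / (tp + fp)
recall = tp / (tp + fn)
f1 = 2 * precision * recall / (precision + recall)

print(tn, fp, fn, tp)      # 3 1 1 3
print(accuracy, recall)    # 0.75 0.75
```

Note the ordering convention: scikit-learn puts actual labels on the rows and predicted labels on the columns, with the negative class first.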
Scikit-learn example
Confusion matrix
from sklearn.metrics import confusion_matrix
cm = confusion_matrix(y_true, y_pred)
print(cm)
Mini-checkpoint
If FN is very high, what does that mean?
- You're missing many real positives, so recall (TP / (TP + FN)) drops.
🧪 Try It Yourself
Exercise 1 – Train-Test Split
Exercise 2 – Fit a Linear Model
Exercise 3 – Evaluate with MSE
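The three exercises above can be sketched end to end. This is a minimal outline using synthetic data (the data and noise level are made up for illustration):

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_squared_error

# Hypothetical synthetic data: y = 3x + Gaussian noise.
rng = np.random.default_rng(0)
X = rng.uniform(0, 10, size=(200, 1))
y = 3 * X.ravel() + rng.normal(0, 1, size=200)

# Exercise 1: hold out a test set.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0
)

# Exercise 2: fit a linear model on the training split only.
model = LinearRegression().fit(X_train, y_train)

# Exercise 3: evaluate with mean squared error on the held-out split.
mse = mean_squared_error(y_test, model.predict(X_test))
print(f"MSE: {mse:.3f}")
```

With noise of variance 1, the test MSE should come out close to 1; a much larger value would suggest the model is misspecified or overfitting.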
