Confusion Matrix Calculator

Analyze classification model performance with a confusion matrix and a comprehensive set of metrics

Confusion Matrix Input

True Positive (TP): Positive cases correctly predicted as positive

False Positive (FP): Negative cases incorrectly predicted as positive (Type I error)

True Negative (TN): Negative cases correctly predicted as negative

False Negative (FN): Positive cases incorrectly predicted as negative (Type II error)

Confusion Matrix Visualization

                     Predicted Positive   Predicted Negative
Actual Positive      TP = 80              FN = 70
Actual Negative      FP = 20              TN = 30

Total Samples: 200

Actual Positive: 150 | Actual Negative: 50

Predicted Positive: 100 | Predicted Negative: 100
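
The four counts can be derived directly from paired lists of actual and predicted labels. Below is a minimal Python sketch; the y_true and y_pred arrays are hypothetical, and any 0/1 binary encoding works the same way.

```python
# Count TP, FP, TN, FN from paired actual/predicted binary labels (1 = positive, 0 = negative).
def confusion_counts(y_true, y_pred):
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    return tp, fp, tn, fn

y_true = [1, 1, 0, 0, 1]   # hypothetical actual labels
y_pred = [1, 0, 0, 1, 1]   # hypothetical model predictions
print(confusion_counts(y_true, y_pred))   # (2, 1, 1, 1)
```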

Performance Metrics

Accuracy: 55.00%   [(TP + TN) / Total]
Precision: 80.00%   [TP / (TP + FP)]
Recall: 53.33%   [TP / (TP + FN)]
F1 Score: 0.6400   [harmonic mean of precision and recall]
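
These four headline metrics follow directly from the counts. A short sketch reproducing the values above from the example matrix (TP = 80, FP = 20, TN = 30, FN = 70):

```python
# Headline metrics for the example matrix: TP = 80, FP = 20, TN = 30, FN = 70.
tp, fp, tn, fn = 80, 20, 30, 70

accuracy  = (tp + tn) / (tp + tn + fp + fn)                 # 0.55
precision = tp / (tp + fp)                                  # 0.80
recall    = tp / (tp + fn)                                  # 0.5333...
f1        = 2 * precision * recall / (precision + recall)   # 0.64

print(f"Accuracy:  {accuracy:.2%}")    # 55.00%
print(f"Precision: {precision:.2%}")   # 80.00%
print(f"Recall:    {recall:.2%}")      # 53.33%
print(f"F1 Score:  {f1:.4f}")          # 0.6400
```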

Detailed Performance Metrics

True Positive Rate (TPR/Sensitivity): 53.33%
False Negative Rate (FNR): 46.67%
False Positive Rate (FPR): 40.00%
True Negative Rate (TNR/Specificity): 60.00%
False Discovery Rate (FDR): 20.00%
Matthews Correlation Coefficient (MCC): 0.1155
Total Samples: 200
Correct Predictions: 110
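
The detailed rates and the MCC can be checked the same way. A sketch using the same example counts:

```python
import math

# Detailed rates and MCC for the example matrix: TP = 80, FP = 20, TN = 30, FN = 70.
tp, fp, tn, fn = 80, 20, 30, 70

tpr = tp / (tp + fn)   # 0.5333  True Positive Rate (sensitivity / recall)
fnr = fn / (tp + fn)   # 0.4667  False Negative Rate
fpr = fp / (fp + tn)   # 0.4000  False Positive Rate
tnr = tn / (fp + tn)   # 0.6000  True Negative Rate (specificity)
fdr = fp / (tp + fp)   # 0.2000  False Discovery Rate

# Matthews Correlation Coefficient: a single balanced score even with unequal class sizes.
mcc = (tp * tn - fp * fn) / math.sqrt((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
print(round(mcc, 4))   # 0.1155
```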

Example: Email Spam Classification

Scenario

Dataset: 200 emails classified as spam or not spam

True Positive (80): Emails correctly identified as spam

False Positive (20): Non-spam emails incorrectly labeled as spam

True Negative (30): Non-spam emails correctly identified

False Negative (70): Spam emails incorrectly labeled as non-spam

Interpretation

Accuracy (55%): Overall correctness of the model

Precision (80%): When the model predicts spam, it is correct 80% of the time

Recall (53%): Model catches 53% of all actual spam emails

F1 Score (0.64): Balanced measure between precision and recall

Confusion Matrix Guide

True Positive (TP)

Correctly predicted positive cases

False Positive (FP)

Incorrectly predicted as positive (Type I error)

True Negative (TN)

Correctly predicted negative cases

False Negative (FN)

Incorrectly predicted as negative (Type II error)

Metric Interpretations

Accuracy

Overall model correctness

Precision

Quality of positive predictions

Recall (Sensitivity)

Ability to find all positive cases

F1 Score

Balance between precision and recall

Specificity

True negative rate

Understanding Confusion Matrix

What is a Confusion Matrix?

A confusion matrix is a table used to evaluate the performance of a classification model. It provides a detailed breakdown of correct and incorrect predictions for each class, enabling comprehensive analysis of model performance beyond simple accuracy.

Key Applications

  • Machine learning model evaluation
  • Medical diagnosis system assessment
  • Quality control and defect detection
  • Fraud detection system evaluation

Performance Metrics Formulas

Accuracy: (TP + TN) / (TP + TN + FP + FN)
Precision: TP / (TP + FP)
Recall: TP / (TP + FN)
F1 Score: 2 × (Precision × Recall) / (Precision + Recall)
Specificity: TN / (TN + FP)
MCC: (TP×TN - FP×FN) / √((TP+FP)(TP+FN)(TN+FP)(TN+FN))
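
If scikit-learn is available, the same formulas can be cross-checked against its built-in metric functions. The label arrays below are constructed purely so that they reproduce the example counts (TP = 80, FP = 20, TN = 30, FN = 70):

```python
from sklearn.metrics import (confusion_matrix, accuracy_score, precision_score,
                             recall_score, f1_score, matthews_corrcoef)

# Synthetic labels that reproduce the example matrix: TP = 80, FP = 20, TN = 30, FN = 70.
y_true = [1] * 80 + [0] * 20 + [0] * 30 + [1] * 70
y_pred = [1] * 80 + [1] * 20 + [0] * 30 + [0] * 70

print(confusion_matrix(y_true, y_pred))    # [[TN FP], [FN TP]] = [[30 20], [70 80]]
print(accuracy_score(y_true, y_pred))      # 0.55
print(precision_score(y_true, y_pred))     # 0.8
print(recall_score(y_true, y_pred))        # 0.5333...
print(f1_score(y_true, y_pred))            # 0.64
print(matthews_corrcoef(y_true, y_pred))   # 0.1155...
```

Note that scikit-learn's confusion_matrix orders rows and columns by sorted label value, so for 0/1 labels it prints [[TN, FP], [FN, TP]] rather than the layout shown earlier.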

When to Use Each Metric

  • Precision: When false positives are costly
  • Recall: When false negatives are costly
  • F1 Score: When you need a balance between precision and recall
  • Accuracy: When classes are balanced
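
When one error type matters more than the other, a weighted variant of F1, the F-beta score, makes the trade-off explicit: beta > 1 weights recall more heavily, beta < 1 weights precision. A minimal sketch using the example's precision (0.80) and recall (0.5333):

```python
# F-beta score: generalizes F1 by weighting recall beta times as much as precision.
def f_beta(precision, recall, beta):
    b2 = beta ** 2
    return (1 + b2) * precision * recall / (b2 * precision + recall)

precision, recall = 0.80, 80 / 150   # values from the example matrix

print(round(f_beta(precision, recall, 1.0), 4))   # 0.64    standard F1
print(round(f_beta(precision, recall, 2.0), 4))   # 0.5714  F2: favors recall
print(round(f_beta(precision, recall, 0.5), 4))   # 0.7273  F0.5: favors precision
```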