Confusion Matrix Calculator
Analyze classification model performance with a confusion matrix and a comprehensive set of metrics
Confusion Matrix Input
True Positive (TP): Correct predictions labeled as positive
False Positive (FP): Wrong predictions labeled as positive (Type I error)
True Negative (TN): Correct predictions labeled as negative
False Negative (FN): Wrong predictions labeled as negative (Type II error)
Confusion Matrix Visualization
Total Samples: 200
Actual Positive: 150 (TP + FN = 80 + 70) | Actual Negative: 50 (FP + TN = 20 + 30)
Predicted Positive: 100 (TP + FP = 80 + 20) | Predicted Negative: 100 (TN + FN = 30 + 70)
Performance Metrics
Example: Email Spam Classification
Scenario
Dataset: 200 emails classified as spam or not spam
True Positive (80): Emails correctly identified as spam
False Positive (20): Non-spam emails incorrectly labeled as spam
True Negative (30): Non-spam emails correctly identified
False Negative (70): Spam emails incorrectly labeled as non-spam
Interpretation
Accuracy (55%): Overall correctness of the model
Precision (80%): When model predicts spam, it's correct 80% of the time
Recall (53%): Model catches 53% of all actual spam emails
F1 Score (0.64): Balanced measure between precision and recall
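These percentages follow directly from the four counts. A minimal Python sketch, using nothing beyond the counts from the scenario above, reproduces them:

```python
# Counts from the spam example above
tp, fp, tn, fn = 80, 20, 30, 70

total = tp + fp + tn + fn                            # 200 emails
accuracy = (tp + tn) / total                         # 110 / 200 = 0.55
precision = tp / (tp + fp)                           # 80 / 100 = 0.80
recall = tp / (tp + fn)                              # 80 / 150 ≈ 0.533
f1 = 2 * precision * recall / (precision + recall)   # ≈ 0.64

print(f"Accuracy:  {accuracy:.0%}")   # 55%
print(f"Precision: {precision:.0%}")  # 80%
print(f"Recall:    {recall:.0%}")     # 53%
print(f"F1 Score:  {f1:.2f}")         # 0.64
```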
Confusion Matrix Guide
True Positive (TP)
Correctly predicted positive cases
False Positive (FP)
Incorrectly predicted as positive (Type I error)
True Negative (TN)
Correctly predicted negative cases
False Negative (FN)
Incorrectly predicted as negative (Type II error)
Metric Interpretations
Accuracy
Overall model correctness
Precision
Quality of positive predictions
Recall (Sensitivity)
Ability to find all positive cases
F1 Score
Balance between precision and recall
Specificity
True negative rate
Understanding the Confusion Matrix
What is a Confusion Matrix?
A confusion matrix is a table used to evaluate the performance of a classification model. It provides a detailed breakdown of correct and incorrect predictions for each class, enabling comprehensive analysis of model performance beyond simple accuracy.
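In practice, the four counts are usually tallied programmatically rather than by hand. A minimal sketch, assuming scikit-learn is available (the label lists here are hypothetical):

```python
from sklearn.metrics import confusion_matrix

# Hypothetical ground-truth labels and model predictions (1 = positive, 0 = negative)
y_true = [1, 1, 0, 1, 0, 0, 1, 0]
y_pred = [1, 0, 0, 1, 0, 1, 1, 0]

# Rows correspond to actual classes, columns to predicted classes
cm = confusion_matrix(y_true, y_pred)
print(cm)  # [[3 1]
           #  [1 3]]

# For binary labels, ravel() unpacks the cells as TN, FP, FN, TP
tn, fp, fn, tp = confusion_matrix(y_true, y_pred).ravel()
print(f"TP={tp}, FP={fp}, TN={tn}, FN={fn}")  # TP=3, FP=1, TN=3, FN=1
```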
Key Applications
- Machine learning model evaluation
- Medical diagnosis system assessment
- Quality control and defect detection
- Fraud detection system evaluation
Performance Metrics Formulas
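For reference, the standard definitions of these metrics in terms of the four cell counts:

```latex
\begin{aligned}
\text{Accuracy}    &= \frac{TP + TN}{TP + TN + FP + FN} \\
\text{Precision}   &= \frac{TP}{TP + FP} \\
\text{Recall (Sensitivity)} &= \frac{TP}{TP + FN} \\
\text{Specificity} &= \frac{TN}{TN + FP} \\
F_1 &= 2 \cdot \frac{\text{Precision} \cdot \text{Recall}}{\text{Precision} + \text{Recall}}
\end{aligned}
```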
When to Use Each Metric
- Precision: When false positives are costly (e.g., flagging legitimate email as spam)
- Recall: When false negatives are costly (e.g., letting actual spam through)
- F1 Score: When you need a balance between precision and recall
- Accuracy: When classes are balanced and all errors are equally costly
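To weigh these trade-offs on real predictions, scikit-learn's classification_report (assuming scikit-learn is available; the labels below are the same hypothetical ones as in the earlier sketch) prints the per-class metrics side by side:

```python
from sklearn.metrics import classification_report

# Hypothetical labels (1 = positive, 0 = negative)
y_true = [1, 1, 0, 1, 0, 0, 1, 0]
y_pred = [1, 0, 0, 1, 0, 1, 1, 0]

# One call reports precision, recall, and F1 per class plus overall accuracy,
# which makes the trade-offs listed above easy to inspect
print(classification_report(y_true, y_pred, target_names=["negative", "positive"]))
```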