Matthews Correlation Coefficient Calculator

Evaluate binary classification quality using the Matthews Correlation Coefficient with comprehensive metrics

Binary Classification Confusion Matrix

Enter Confusion Matrix Values

  • TP: correctly predicted positive cases
  • FP: incorrectly predicted as positive
  • FN: incorrectly predicted as negative
  • TN: correctly predicted negative cases

Confusion Matrix

                      Predicted Positive    Predicted Negative
  Actual Positive     TP                    FN
  Actual Negative     FP                    TN

Total samples: TP + FP + FN + TN

Matthews Correlation Coefficient Results

For the entered values, the calculator reports the MCC, its quality rating, the overall accuracy, and a plain-language interpretation.

Formula: MCC = (TP×TN - FP×FN) / √[(TP+FP)(TP+FN)(TN+FP)(TN+FN)]
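The formula can be computed directly from the four counts. A minimal sketch in Python; the zero-denominator convention (returning 0 when any marginal sum is zero) is an assumption commonly used in practice, not part of the formula itself:

```python
import math

def mcc(tp: int, fp: int, fn: int, tn: int) -> float:
    """Matthews Correlation Coefficient from confusion-matrix counts."""
    numerator = tp * tn - fp * fn
    denominator = math.sqrt((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
    # If any marginal sum is zero, MCC is undefined; returning 0 is a
    # common convention (an assumption, not specified by the formula).
    return numerator / denominator if denominator else 0.0

print(round(mcc(10, 5, 15, 70), 4))  # 0.4042
```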

MCC Quality Scale

  • Inverse: < -0.2
  • No Value: -0.2 to 0.2
  • Poor: 0.2 to 0.4
  • Fair: 0.4 to 0.6
  • Good: 0.6 to 0.8
  • Excellent: 0.8 to 1.0
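The scale maps directly to a lookup. A small sketch; treating each band as closed on the left is an assumption, since the scale does not say which band a boundary value belongs to:

```python
def mcc_quality(value: float) -> str:
    """Map an MCC value to the quality labels in the scale above.

    Boundary handling (each band includes its lower bound) is an
    assumption; the scale leaves boundary values ambiguous.
    """
    if value < -0.2:
        return "Inverse"
    if value < 0.2:
        return "No Value"
    if value < 0.4:
        return "Poor"
    if value < 0.6:
        return "Fair"
    if value < 0.8:
        return "Good"
    return "Excellent"

print(mcc_quality(0.4042))  # Fair
```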

Example: Ceramic Factory Quality Control

Scenario

Problem: Quality control check of 100 ceramic plates

Prediction: 15 plates identified as defective

Reality: 25 plates were actually defective

Correct identifications: 10 out of 15 predictions

Confusion Matrix

                     Pred. Defective    Pred. Good
  Act. Defective     TP: 10             FN: 15
  Act. Good          FP: 5              TN: 70

Calculation

MCC = [(10×70) - (5×15)] / √[(10+5)(10+15)(70+5)(70+15)]

MCC = [700 - 75] / √[15×25×75×85]

MCC = 625 / √2,390,625 ≈ 625 / 1546.16

MCC ≈ 0.4042 (Fair quality)

Sensitivity: 40% (only 40% of defects caught)
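The worked example can be checked in a few lines of Python:

```python
import math

# Ceramic factory example: 100 plates, 25 truly defective,
# 15 flagged as defective, 10 of those flags correct.
tp, fp, fn, tn = 10, 5, 15, 70

numerator = tp * tn - fp * fn                  # 700 - 75 = 625
denominator = math.sqrt((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
mcc = numerator / denominator
sensitivity = tp / (tp + fn)                   # fraction of real defects caught

print(f"MCC = {mcc:.4f}")                  # MCC = 0.4042
print(f"Sensitivity = {sensitivity:.0%}")  # Sensitivity = 40%
```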

Classification Metrics

  • MCC: overall classification quality; best for imbalanced datasets
  • Sensitivity: true positive rate; how many positives are caught
  • Specificity: true negative rate; how many negatives are correctly identified
  • Precision: positive predictive value; accuracy of positive predictions
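All four metrics derive from the same confusion-matrix counts. A sketch, with accuracy included for completeness; division-by-zero guards are omitted for brevity (it assumes every marginal sum is nonzero):

```python
def classification_metrics(tp: int, fp: int, fn: int, tn: int) -> dict:
    """Standard metrics from confusion-matrix counts.

    Assumes every marginal sum is nonzero (no zero-division guards).
    """
    total = tp + fp + fn + tn
    return {
        "accuracy": (tp + tn) / total,
        "sensitivity": tp / (tp + fn),  # true positive rate (recall)
        "specificity": tn / (tn + fp),  # true negative rate
        "precision": tp / (tp + fp),    # positive predictive value
    }

print(classification_metrics(10, 5, 15, 70)["sensitivity"])  # 0.4
```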

MCC Tips

  • MCC considers all four confusion matrix quadrants
  • Best metric for imbalanced datasets
  • Range: -1 (worst) to +1 (perfect)
  • 0 indicates random performance
  • Widely used in machine learning evaluation

Understanding Matthews Correlation Coefficient

What is MCC?

The Matthews Correlation Coefficient (MCC) is a balanced measure for evaluating binary classification quality, even with imbalanced datasets. Unlike accuracy, MCC considers all four quadrants of the confusion matrix and provides a more reliable assessment.

Why Use MCC?

  • Robust to class imbalance
  • Considers all classification outcomes
  • Interpretable scale (-1 to +1)
  • Widely used in bioinformatics and ML
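Robustness to class imbalance is easy to demonstrate with a hypothetical example: on a 95:5 test set, a classifier that always predicts the majority class scores 95% accuracy yet an MCC of 0 (using the common convention that a zero denominator yields MCC = 0):

```python
import math

def mcc(tp, fp, fn, tn):
    num = tp * tn - fp * fn
    den = math.sqrt((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
    return num / den if den else 0.0  # zero-denominator convention (assumption)

# Hypothetical 95:5 imbalanced test set; the classifier always
# predicts the majority ("negative") class.
tp, fp, fn, tn = 0, 0, 5, 95
accuracy = (tp + tn) / (tp + fp + fn + tn)

print(accuracy)             # 0.95 -- accuracy looks excellent
print(mcc(tp, fp, fn, tn))  # 0.0  -- MCC reveals random-level performance
```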

Applications

Medical Diagnosis

Evaluating diagnostic test accuracy for diseases with different prevalence rates.

Machine Learning

Comparing classifier performance across different algorithms and datasets.

Quality Control

Assessing inspection systems in manufacturing and production environments.

Bioinformatics

Evaluating protein structure prediction and gene expression analysis.

Formula Components

MCC = (TP×TN - FP×FN) / √[(TP+FP)(TP+FN)(TN+FP)(TN+FN)]

  • TP (True Positive): correctly identified positives
  • FP (False Positive): incorrectly identified positives
  • FN (False Negative): missed positive cases
  • TN (True Negative): correctly identified negatives