Matthews Correlation Coefficient Calculator
Enter confusion matrix counts below. The page calculates MCC and supporting diagnostic metrics for binary classification evaluation.
Example Data Table
| Scenario | TP | TN | FP | FN | Approx. MCC | Comment |
|---|---|---|---|---|---|---|
| Model A | 50 | 45 | 5 | 10 | 0.7303 | Strong balanced performance. |
| Model B | 30 | 60 | 20 | 15 | 0.4082 | Moderate correlation. |
| Model C | 80 | 10 | 40 | 5 | 0.2169 | High recall, weaker balance. |
| Model D | 25 | 25 | 25 | 25 | 0.0000 | No overall correlation. |
Formula Used
MCC = (TP × TN − FP × FN) / √[(TP + FP)(TP + FN)(TN + FP)(TN + FN)]
Meaning of the terms
- TP: predicted positive and actually positive.
- TN: predicted negative and actually negative.
- FP: predicted positive but actually negative.
- FN: predicted negative but actually positive.
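The formula and the four terms above translate directly into code. This is a minimal sketch of the computation, not the calculator's internal implementation; the function name `mcc` is illustrative.

```python
import math

def mcc(tp, tn, fp, fn):
    """Matthews Correlation Coefficient from the four confusion matrix counts."""
    numerator = tp * tn - fp * fn
    denominator = math.sqrt((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
    return numerator / denominator

# Model A from the example table: TP=50, TN=45, FP=5, FN=10
print(round(mcc(50, 45, 5, 10), 4))  # 0.7303
```

Running the same function on the other table rows reproduces their Approx. MCC values.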
Why MCC matters
MCC is often preferred when classes are imbalanced. It rewards balanced prediction quality and penalizes one-sided success that ordinary accuracy may hide.
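A small hypothetical example makes the contrast concrete: on a dataset with 95 negatives and 5 positives, a model that finds only one of the positives still scores high accuracy, while MCC exposes the imbalance. The counts below are invented for illustration.

```python
import math

def mcc(tp, tn, fp, fn):
    num = tp * tn - fp * fn
    den = math.sqrt((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
    return num / den

# Hypothetical imbalanced data: 95 negatives, 5 positives.
# The model correctly rejects every negative but finds only 1 of 5 positives.
tp, tn, fp, fn = 1, 95, 0, 4
accuracy = (tp + tn) / (tp + tn + fp + fn)
print(f"accuracy = {accuracy:.2f}")            # 0.96 -- looks strong
print(f"MCC      = {mcc(tp, tn, fp, fn):.2f}")  # 0.44 -- reveals the weakness
```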
How to Use This Calculator
- Enter a model name if you want labeled exports.
- Type the positive and negative class names.
- Provide confusion matrix counts for TP, TN, FP, and FN.
- Choose how many decimal places you want displayed.
- Click Calculate MCC to generate results.
- Review MCC, accuracy, precision, recall, and the interpretation note.
- Use the graph to compare metric levels quickly.
- Download the results as CSV or PDF when needed.
Frequently Asked Questions
1. What does MCC measure?
MCC measures the correlation between predicted and actual binary classes. It combines all four confusion matrix cells into one balanced statistic.
2. Why use MCC instead of accuracy alone?
Accuracy can look strong when one class dominates. MCC checks the whole confusion matrix and better reflects balanced predictive quality.
3. What is a good MCC value?
Values near 1 indicate strong agreement. Values near 0 suggest weak overall correlation. Negative values indicate inverse prediction behavior.
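The negative extreme is easy to demonstrate: a classifier that always predicts the opposite class has only FP and FN counts, giving MCC = -1. The counts below are illustrative.

```python
import math

def mcc(tp, tn, fp, fn):
    num = tp * tn - fp * fn
    den = math.sqrt((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
    return num / den

# Perfect inversion on 10 positives and 10 negatives:
# every positive is predicted negative (FN) and vice versa (FP).
print(mcc(0, 0, 10, 10))  # -1.0
```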
4. Can MCC handle imbalanced data?
Yes. MCC is popular for imbalanced classification because it does not rely only on overall correct predictions.
5. Why can MCC become undefined?
MCC becomes undefined when the denominator equals zero, which happens whenever a row or column total of the confusion matrix is zero, for example when the model never predicts the positive class.
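One way to guard against this case is to check the denominator before dividing. The sketch below returns `None` for the undefined case; some tools instead report 0 by convention, so this choice is an assumption, not the calculator's documented behavior.

```python
import math

def safe_mcc(tp, tn, fp, fn):
    """Return MCC, or None when the denominator is zero (MCC undefined)."""
    den = (tp + fp) * (tp + fn) * (tn + fp) * (tn + fn)
    if den == 0:
        return None
    return (tp * tn - fp * fn) / math.sqrt(den)

# Model predicts every case positive: TN + FN = 0, so MCC is undefined.
print(safe_mcc(50, 0, 50, 0))  # None
```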
6. Does this calculator also show supporting metrics?
Yes. It reports accuracy, precision, recall, specificity, F1 score, prevalence, balanced accuracy, and negative predictive value.
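Each of these supporting metrics follows from the same four counts. This sketch assumes no zero denominators (e.g. at least one positive prediction); the function name and dictionary keys are illustrative, not the calculator's export format.

```python
def supporting_metrics(tp, tn, fp, fn):
    """Diagnostic metrics from confusion matrix counts; assumes no zero denominators."""
    total = tp + tn + fp + fn
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)          # also called sensitivity
    specificity = tn / (tn + fp)
    return {
        "accuracy": (tp + tn) / total,
        "precision": precision,
        "recall": recall,
        "specificity": specificity,
        "f1": 2 * precision * recall / (precision + recall),
        "prevalence": (tp + fn) / total,
        "balanced_accuracy": (recall + specificity) / 2,
        "npv": tn / (tn + fn),       # negative predictive value
    }

# Model A from the example table
for name, value in supporting_metrics(50, 45, 5, 10).items():
    print(f"{name}: {value:.4f}")
```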
7. Are decimal confusion matrix values allowed?
No. Confusion matrix counts represent observed cases, so whole non-negative numbers are expected.
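A validation step for this rule might look like the following sketch; the function name is hypothetical and the exact error message is an assumption.

```python
def validate_count(value):
    """Accept only whole, non-negative confusion matrix counts."""
    n = float(value)
    if n < 0 or not n.is_integer():
        raise ValueError(f"count must be a non-negative whole number, got {value!r}")
    return int(n)

print(validate_count("42"))  # 42
```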
8. What should I export, CSV or PDF?
Use CSV for spreadsheets and further analysis. Use PDF for reports, meetings, and shareable summaries.
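For the CSV path, a long-format table (one metric per row) is convenient for spreadsheets and further analysis. The column names below are illustrative; the calculator's actual export layout may differ.

```python
import csv
import io

def export_csv(model_name, metrics):
    """Write metrics as long-format CSV text: one (model, metric, value) row each."""
    buf = io.StringIO()
    writer = csv.writer(buf)
    writer.writerow(["model", "metric", "value"])
    for name, value in metrics.items():
        writer.writerow([model_name, name, f"{value:.4f}"])
    return buf.getvalue()

print(export_csv("Model A", {"mcc": 0.7303, "accuracy": 0.8636}))
```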