Matthews Correlation Coefficient Calculation

Measure binary classification quality beyond simple accuracy. Enter confusion matrix counts, visualize the MCC alongside supporting metrics, and download reports to evaluate model stability.

Matthews Correlation Coefficient Calculator

Enter confusion matrix counts below. The page calculates MCC and supporting diagnostic metrics for binary classification evaluation.

Reset

Example Data Table

Scenario TP TN FP FN Approx. MCC Comment
Model A 50 45 5 10 0.7303 Strong balanced performance.
Model B 30 60 20 15 0.4082 Moderate correlation.
Model C 80 10 40 5 0.2169 High recall, weaker balance.
Model D 25 25 25 25 0.0000 No overall correlation.
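The approximate MCC values in the table can be reproduced with a short Python sketch (the `mcc` helper below is our own illustration, not the calculator's code):

```python
from math import sqrt

def mcc(tp, tn, fp, fn):
    # Matthews Correlation Coefficient from the four confusion matrix counts
    num = tp * tn - fp * fn
    den = sqrt((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
    return num / den if den else 0.0

# Reproduce the example table rows
scenarios = {
    "Model A": (50, 45, 5, 10),
    "Model B": (30, 60, 20, 15),
    "Model C": (80, 10, 40, 5),
    "Model D": (25, 25, 25, 25),
}
for name, counts in scenarios.items():
    print(name, round(mcc(*counts), 4))
```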

Formula Used

Matthews Correlation Coefficient

MCC = (TP × TN − FP × FN) / √[(TP + FP)(TP + FN)(TN + FP)(TN + FN)]

Meaning of the terms

  • TP: predicted positive and actually positive.
  • TN: predicted negative and actually negative.
  • FP: predicted positive but actually negative.
  • FN: predicted negative but actually positive.

Why MCC matters

MCC is often preferred when classes are imbalanced. It rewards balanced prediction quality and penalizes one-sided success that ordinary accuracy may hide.

How to Use This Calculator

  1. Enter a model name if you want labeled exports.
  2. Type the positive and negative class names.
  3. Provide confusion matrix counts for TP, TN, FP, and FN.
  4. Choose how many decimal places you want displayed.
  5. Click Calculate MCC to generate results.
  6. Review MCC, accuracy, precision, recall, and the interpretation note.
  7. Use the graph to compare metric levels quickly.
  8. Download the results as CSV or PDF when needed.

Frequently Asked Questions

1. What does MCC measure?

MCC measures the correlation between predicted and actual binary classes. It combines all four confusion matrix cells into one balanced statistic.

2. Why use MCC instead of accuracy alone?

Accuracy can look strong when one class dominates. MCC checks the whole confusion matrix and better reflects balanced predictive quality.
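As an illustration (using hypothetical counts of our own, not figures from this page): on a dataset where 95 of 100 cases are negative, a model that finds only one of five positives still reaches 95% accuracy, while MCC stays low:

```python
from math import sqrt

def mcc(tp, tn, fp, fn):
    den = sqrt((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
    return (tp * tn - fp * fn) / den if den else 0.0

# Imbalanced data: 95 negatives, 5 positives; the model catches only 1 positive
tp, tn, fp, fn = 1, 94, 1, 4
accuracy = (tp + tn) / (tp + tn + fp + fn)
print(accuracy)                        # 0.95 -- looks strong
print(round(mcc(tp, tn, fp, fn), 4))   # 0.295 -- reveals weak minority-class performance
```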

3. What is a good MCC value?

Values near 1 indicate strong agreement between predictions and actual labels. Values near 0 suggest performance no better than chance. Negative values (down to −1) indicate systematic disagreement between predictions and outcomes.

4. Can MCC handle imbalanced data?

Yes. MCC is popular for imbalanced classification because it does not rely only on overall correct predictions.

5. Why can MCC become undefined?

MCC becomes undefined when the denominator equals zero. That happens when any marginal sum (TP + FP, TP + FN, TN + FP, or TN + FN) is zero, i.e. when the predictions or the actual labels contain only one class.
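A minimal sketch of the usual workaround, assuming the common convention of returning 0 when the denominator vanishes:

```python
from math import sqrt

def mcc(tp, tn, fp, fn):
    # If one row or column of the confusion matrix is empty, the denominator
    # is zero; returning 0.0 in that case is a common convention.
    den = sqrt((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
    return (tp * tn - fp * fn) / den if den else 0.0

# A model that predicts every case negative: TP + FP = 0, so the denominator is 0
print(mcc(0, 95, 0, 5))  # 0.0
```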

6. Does this calculator also show supporting metrics?

Yes. It reports accuracy, precision, recall, specificity, F1 score, prevalence, balanced accuracy, and negative predictive value.
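All of these supporting metrics derive from the same four counts. The `supporting_metrics` helper below is a hypothetical sketch of the standard formulas, not the calculator's own code:

```python
def supporting_metrics(tp, tn, fp, fn):
    # Standard binary-classification metrics from confusion matrix counts
    total = tp + tn + fp + fn
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)            # a.k.a. sensitivity
    specificity = tn / (tn + fp)
    return {
        "accuracy": (tp + tn) / total,
        "precision": precision,
        "recall": recall,
        "specificity": specificity,
        "f1": 2 * precision * recall / (precision + recall),
        "prevalence": (tp + fn) / total,
        "balanced_accuracy": (recall + specificity) / 2,
        "npv": tn / (tn + fn),         # negative predictive value
    }

# Example using the Model A counts from the table above
for name, value in supporting_metrics(50, 45, 5, 10).items():
    print(f"{name}: {value:.4f}")
```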

7. Are decimal confusion matrix values allowed?

No. Confusion matrix counts represent observed cases, so only non-negative whole numbers are accepted.

8. What should I export, CSV or PDF?

Use CSV for spreadsheets and further analysis. Use PDF for reports, meetings, and shareable summaries.

Related Calculators

Data Normalization Tool, Random Forest Predictor, Binary Classification Tool, Probability Score Calculator, Learning Curve Tool, XGBoost Predictor, Neural Network Predictor, Brier Score Calculator, Matthews Correlation Coefficient

Important Note: All calculators listed on this site are for educational purposes only, and we do not guarantee the accuracy of results. Please consult other sources as well.