Precision Recall AUC Calculator

Explore thresholds using metrics, charts, and exports. Paste labels and scores for accurate model diagnostics. Make threshold decisions with confidence using visual performance summaries.

Calculator Input

Paste two columns: actual label and prediction score. A header row is optional.

Accepted delimiters: comma, tab, or semicolon.
Positive class: any label that matches this value is treated as positive (for example 1, yes, fraud, or churn).
Threshold: predictions at or beyond this threshold become positive.
Score direction: choose lower-is-stronger for risk ranks or distance-style scores.
F-beta weight: use 1 for F1, 2 to favor recall, 0.5 to favor precision.
Decimal places: controls displayed precision in results and exports.

Example Data Table

This sample is already compatible with the calculator input format.

#    Actual Label    Predicted Score
1    1               0.98
2    1               0.93
3    0               0.90
4    1               0.88
5    0               0.84
6    1               0.80
7    1               0.76
8    0               0.72
9    1               0.69
10   0               0.64
11   1               0.61
12   0               0.57

Formula Used

Confusion Matrix Terms

TP = true positives, FP = false positives, TN = true negatives, FN = false negatives.

Precision = TP / (TP + FP)

Recall = TP / (TP + FN)

Specificity = TN / (TN + FP)

Accuracy = (TP + TN) / (TP + FP + TN + FN)

F1 Score = 2 × Precision × Recall / (Precision + Recall)

F-Beta = (1 + β²) × Precision × Recall / ((β² × Precision) + Recall)

False Positive Rate = FP / (FP + TN)

True Positive Rate = Recall

Balanced Accuracy = (Recall + Specificity) / 2

MCC = (TP×TN − FP×FN) / √((TP+FP)(TP+FN)(TN+FP)(TN+FN))
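The count-based formulas above can be evaluated directly from the four confusion-matrix counts. The following is a minimal Python sketch, not the calculator's own code; `confusion_metrics` is a hypothetical helper that returns None wherever a denominator is zero (shown as N/A in the results).

```python
import math

def confusion_metrics(tp, fp, tn, fn, beta=1.0):
    """Evaluate the count-based metrics above; None marks an undefined value."""
    def div(num, den):
        return num / den if den else None

    precision = div(tp, tp + fp)
    recall = div(tp, tp + fn)            # also the true positive rate
    specificity = div(tn, tn + fp)
    accuracy = div(tp + tn, tp + fp + tn + fn)
    f1 = (div(2 * precision * recall, precision + recall)
          if None not in (precision, recall) else None)
    fbeta = (div((1 + beta**2) * precision * recall, beta**2 * precision + recall)
             if None not in (precision, recall) else None)
    balanced = ((recall + specificity) / 2
                if None not in (recall, specificity) else None)
    denom = math.sqrt((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
    mcc = div(tp * tn - fp * fn, denom)
    return {"precision": precision, "recall": recall,
            "specificity": specificity, "accuracy": accuracy,
            "f1": f1, "fbeta": fbeta,
            "balanced_accuracy": balanced, "mcc": mcc}
```

For instance, thresholding the example data at 0.75 gives TP = 5, FP = 2, TN = 3, FN = 2, so precision and recall are both 5/7 and MCC is 11/35.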

ROC AUC is the trapezoidal area under the ROC curve.

PR AUC is the trapezoidal area under the precision-recall curve.

Average Precision sums precision at each recall gain, weighting each precision value by the increase in recall, using an interpolated precision envelope.
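The ranking-based metrics above can be sketched in a few lines of Python. This is an illustrative sketch rather than the calculator's own code: `ranked_metrics` is a hypothetical name, ties are broken by sort order rather than averaged, both classes are assumed present, and the AP computed here is the non-interpolated step-wise variant, which may differ slightly from an interpolated envelope.

```python
def ranked_metrics(labels, scores):
    """Trapezoidal ROC AUC, trapezoidal PR AUC, and step-wise Average
    Precision, assuming labels of 0/1 and higher scores meaning positive."""
    pairs = sorted(zip(scores, labels), reverse=True)   # rank by score, descending
    pos = sum(labels)
    neg = len(labels) - pos
    tp = fp = 0
    roc = [(0.0, 0.0)]                  # (FPR, TPR) points
    pr = []                             # (recall, precision) points
    ap = 0.0
    prev_recall = 0.0
    for _, y in pairs:
        if y == 1:
            tp += 1
        else:
            fp += 1
        recall = tp / pos
        precision = tp / (tp + fp)
        roc.append((fp / neg, recall))
        pr.append((recall, precision))
        ap += (recall - prev_recall) * precision   # precision weighted by recall gain
        prev_recall = recall

    def trapezoid(points):
        return sum((x2 - x1) * (y1 + y2) / 2
                   for (x1, y1), (x2, y2) in zip(points, points[1:]))

    pr_points = [(0.0, pr[0][1])] + pr  # anchor the PR curve at recall 0
    return trapezoid(roc), trapezoid(pr_points), ap
```

On the example data table, this yields a ROC AUC of 23/35 ≈ 0.657, because 23 of the 35 positive-negative pairs are ranked correctly.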

How to Use This Calculator

  1. Paste your dataset with two columns: actual label and predicted score.
  2. Enter the label that should be treated as the positive class.
  3. Set the threshold that converts scores into predicted positives.
  4. Select whether higher or lower scores indicate stronger positive evidence.
  5. Choose the F-beta weight and display precision you prefer.
  6. Click Calculate Now to generate metrics, threshold analysis, and curves.
  7. Use the CSV or PDF buttons to export the report.
  8. Review the best threshold suggestions for F1 and Youden’s J.
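The threshold scan behind steps 3–8 can be sketched as follows. This is a minimal Python sketch assuming higher scores mean stronger positive evidence and both classes are present; `best_thresholds` is a hypothetical helper, not the calculator's actual code.

```python
def best_thresholds(labels, scores):
    """Try every distinct score as a threshold (score >= t is positive) and
    return (threshold, value) pairs maximizing F1 and Youden's J = TPR - FPR."""
    pos = sum(labels)
    neg = len(labels) - pos
    best_f1 = best_j = None
    for t in sorted(set(scores)):
        tp = sum(1 for y, s in zip(labels, scores) if s >= t and y == 1)
        fp = sum(1 for y, s in zip(labels, scores) if s >= t and y == 0)
        fn = pos - tp
        f1 = 2 * tp / (2 * tp + fp + fn) if (2 * tp + fp + fn) else 0.0
        j = tp / pos - fp / neg          # Youden's J statistic
        if best_f1 is None or f1 > best_f1[1]:
            best_f1 = (t, f1)
        if best_j is None or j > best_j[1]:
            best_j = (t, j)
    return best_f1, best_j
```

On the example data this suggests a threshold of 0.61 for the best F1 (7/9 ≈ 0.778) and 0.76 for the best Youden's J (11/35 ≈ 0.314), illustrating that the two criteria need not agree.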

FAQs

1) What does this calculator measure?

It measures threshold-based classification quality using precision, recall, F1, specificity, accuracy, MCC, ROC AUC, PR AUC, and average precision. It also shows charts and threshold-by-threshold performance.

2) When is PR AUC more useful than ROC AUC?

PR AUC is often more informative for highly imbalanced datasets because it focuses on positive class retrieval quality. ROC AUC can still look strong even when false positives are costly.

3) What should I paste into the input box?

Paste two columns per row: the actual class label first and the predicted score second. A header row is optional. Delimiters may be commas, tabs, or semicolons.
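Input handling along these lines can be sketched in Python. This is a hypothetical illustration of the parsing described above, not the page's actual code; it skips any row (such as a header) whose score column is not numeric.

```python
import re

def parse_rows(text, positive_label="1"):
    """Parse pasted 'label<delim>score' rows where the delimiter may be a
    comma, tab, or semicolon; returns binary labels and float scores."""
    labels, scores = [], []
    for line in text.strip().splitlines():
        parts = [p.strip() for p in re.split(r"[,\t;]", line) if p.strip()]
        if len(parts) < 2:
            continue
        try:
            score = float(parts[-1])
        except ValueError:
            continue                     # header row or malformed line
        labels.append(1 if parts[0] == positive_label else 0)
        scores.append(score)
    return labels, scores
```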

4) What is the difference between score and predicted label?

A score is a continuous value such as a probability, confidence, or rank. A predicted label is produced only after applying a threshold to that score.
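The score-to-label step amounts to a one-line comparison. A minimal sketch, using the "at or beyond the threshold" convention from the input notes; `predict_labels` and its `higher_is_positive` flag are illustrative names, not the calculator's API.

```python
def predict_labels(scores, threshold, higher_is_positive=True):
    """Convert continuous scores into 0/1 predicted labels at a threshold."""
    if higher_is_positive:
        return [1 if s >= threshold else 0 for s in scores]
    return [1 if s <= threshold else 0 for s in scores]  # distance-style scores
```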

5) Why do some metrics show N/A?

A metric becomes undefined when its denominator is zero. For example, precision is undefined when no positive predictions exist, and MCC is undefined when one confusion matrix margin is empty.

6) How is the best threshold chosen?

This page highlights the threshold with the highest F1 score and the threshold with the highest Youden’s J statistic. These are useful guides, not universal rules.

7) Can I use labels other than 0 and 1?

Yes. Any label that exactly matches the positive class field is treated as positive. All other labels are treated as negative for the calculations.

8) What do the export buttons include?

The CSV export includes summary metrics and the threshold analysis table. The PDF export includes a compact report with key metrics, confusion matrix, best thresholds, and threshold table.

Related Calculators

Sensitivity Specificity AUC
Cross Validation AUC
AUC from Confusion Matrix

Important Note: All calculators listed on this site are for educational purposes only, and we do not guarantee the accuracy of results. Please consult other sources as well.