Regression Accuracy Tool Calculator

Audit predictions with clear metrics and residual checks. Test model quality, bias, and stability quickly. Turn raw outputs into reliable insights for better decisions.

Enter Model Outputs

Use commas, spaces, semicolons, or new lines.
Keep the same number of values as the actual data.
Used for adjusted R² and residual standard error.
Choose how many decimals to display in results.
This calculator supports MAE, MSE, RMSE, MAPE, sMAPE, R², adjusted R², explained variance, correlation, residual error, and Durbin-Watson.

Example Data Table

Observation   Actual   Predicted   Residual
1             120      118          2
2             135      138         -3
3             142      140          2
4             158      155          3
5             165      168         -3
6             180      176          4

You can paste these values into the calculator to test the workflow quickly.

Formula Used

Residual (eᵢ) = Actualᵢ − Predictedᵢ
MAE = Σ|eᵢ| / n
MSE = Σ(eᵢ²) / n
RMSE = √MSE
MAPE = [Σ(|eᵢ| / |Actualᵢ|) × 100] / (number of nonzero actual values)
sMAPE = [Σ(2 × |eᵢ| / (|Actualᵢ| + |Predictedᵢ|)) × 100] / n
R² = 1 − (SSE / SST)
Adjusted R² = 1 − [(1 − R²) × (n − 1) / (n − p − 1)]
Mean Error = Σ(Actualᵢ − Predictedᵢ) / n
Residual Std. Error = √[SSE / (n − p − 1)]
Durbin-Watson = Σ(eᵢ − eᵢ₋₁)² / Σeᵢ²

Here, n is the number of observations, p is the number of predictors, SSE is the sum of squared errors, and SST is the total sum of squares.
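The formulas above can be sketched in Python. This is a minimal illustration, not the calculator's actual implementation; the function name `regression_metrics` and the default of one predictor are assumptions for the example.

```python
import math

def regression_metrics(actual, predicted, p=1):
    """Compute the core accuracy metrics from paired actual/predicted lists.
    p is the number of predictors (used by adjusted R2 and residual SE)."""
    n = len(actual)
    residuals = [a - f for a, f in zip(actual, predicted)]
    sse = sum(e * e for e in residuals)          # sum of squared errors
    mean_actual = sum(actual) / n
    sst = sum((a - mean_actual) ** 2 for a in actual)  # total sum of squares
    r2 = 1 - sse / sst
    return {
        "MAE": sum(abs(e) for e in residuals) / n,
        "MSE": sse / n,
        "RMSE": math.sqrt(sse / n),
        "R2": r2,
        "AdjR2": 1 - (1 - r2) * (n - 1) / (n - p - 1),
        "MeanError": sum(residuals) / n,
        "ResidSE": math.sqrt(sse / (n - p - 1)),
    }

# Values from the example data table above
actual = [120, 135, 142, 158, 165, 180]
predicted = [118, 138, 140, 155, 168, 176]
print(regression_metrics(actual, predicted))
```

On the example data this gives an MAE of about 2.83, an RMSE of about 2.92, and an R² of about 0.979.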

How to Use This Calculator

  1. Paste the observed target values into the Actual Values field.
  2. Paste your model outputs into the Predicted Values field.
  3. Enter the number of predictors if you want adjusted R² and residual standard error.
  4. Set the display precision using decimal places.
  5. Click Calculate Accuracy to generate metrics and the performance graph.
  6. Review the summary cards, observation error table, and chart.
  7. Use the CSV and PDF buttons to export the generated report.

Frequently Asked Questions

1. What does this regression accuracy tool measure?

It measures how close predicted values are to actual values. The tool reports error metrics, fit scores, residual diagnostics, and trend visuals for deeper evaluation.

2. Why are MAE and RMSE both included?

MAE shows the average absolute miss, while RMSE penalizes larger mistakes more heavily. Using both helps you judge typical error and sensitivity to outliers.
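The difference is easy to see with a small sketch: two residual sets with the same total absolute error produce the same MAE, but the set containing one large outlier gets a much higher RMSE.

```python
import math

def mae(errors):
    return sum(abs(e) for e in errors) / len(errors)

def rmse(errors):
    return math.sqrt(sum(e * e for e in errors) / len(errors))

# Both sets have a total absolute error of 12 across 4 points
steady = [3, 3, 3, 3]    # consistent small misses
spiky = [0, 0, 0, 12]    # one large outlier

print(mae(steady), rmse(steady))  # 3.0 3.0
print(mae(spiky), rmse(spiky))    # 3.0 6.0
```

Because RMSE squares each error before averaging, the single 12-point miss doubles the RMSE while leaving the MAE unchanged.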

3. When should I use adjusted R²?

Use adjusted R² when comparing models with different numbers of predictors. It penalizes extra variables that do not improve model quality enough.

4. Why can MAPE show N/A?

MAPE divides by actual values. If actual values include zeros, those rows cannot be used safely for MAPE. The tool skips zero actuals automatically.
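A minimal sketch of that skipping behavior (the function name and the use of `None` to represent N/A are assumptions for this example):

```python
def mape(actual, predicted):
    """Mean absolute percentage error, skipping rows where actual == 0.
    Returns None (shown as N/A) when no valid rows remain."""
    terms = [abs(a - f) / abs(a) for a, f in zip(actual, predicted) if a != 0]
    if not terms:
        return None
    return 100 * sum(terms) / len(terms)

print(mape([100, 0, 50], [90, 5, 55]))  # middle row skipped -> 10.0
print(mape([0, 0], [1, 2]))             # no valid rows -> None
```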

5. What does a negative mean error indicate?

A negative mean error means predictions tend to be higher than actual values overall. That suggests the model is biased toward overprediction.

6. What is the purpose of Durbin-Watson?

Durbin-Watson checks whether residuals are correlated in sequence. Values near 2 usually suggest low autocorrelation, while extremes can signal pattern problems.
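The Durbin-Watson formula from the section above can be sketched directly; running it on the residuals from the example data table gives a value close to 2, consistent with low autocorrelation.

```python
def durbin_watson(residuals):
    """DW = sum((e[i] - e[i-1])^2) / sum(e[i]^2).
    Values near 2 suggest little autocorrelation; values near 0 or 4
    suggest positive or negative autocorrelation, respectively."""
    num = sum((residuals[i] - residuals[i - 1]) ** 2
              for i in range(1, len(residuals)))
    den = sum(e * e for e in residuals)
    return num / den

# Residuals from the example data table
print(durbin_watson([2, -3, 2, 3, -3, 4]))  # about 2.67
```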

7. Can I compare two different models here?

Yes. Run the calculator once for each model using the same actual values. Then compare MAE, RMSE, R², bias, and residual behavior.

8. Does a high R² always mean the model is good?

No. A high R² can still hide bias, large outliers, or unstable residual patterns. Always review error metrics and diagnostic behavior together.

Related Calculators

Model Fit Score, Regression R Squared, Adjusted Model Fit, Multiple R Squared, Explained Variance Score, Regression Fit Index, Model Accuracy Score, Linear Model Fit, Regression Performance Score, R Squared Online

Important Note: All calculators listed on this site are for educational purposes only, and we do not guarantee the accuracy of the results. Please consult other sources as well.