Build regression estimates from streamed or stored observations. See coefficients, residuals, covariance, and predictions instantly. Below you will find clear inputs, export options, formulas, worked examples, and practical guidance.
| Observation | X1 | X2 | Y |
|---|---|---|---|
| 1 | 1 | 2 | 2.6 |
| 2 | 2 | 1 | 4.8 |
| 3 | 3 | 2 | 5.9 |
| 4 | 4 | 3 | 6.8 |
| 5 | 5 | 4 | 7.9 |
| 6 | 6 | 5 | 8.4 |
This sample works with an intercept, two predictors, and one target column. Use it to test the calculator quickly.
Recursive least squares updates the coefficient vector after each observation instead of refitting the entire regression model from the beginning.
Coefficient update: θₖ = θₖ₋₁ + Kₖ eₖ
Prediction error: eₖ = yₖ − φₖᵀ θₖ₋₁
Gain vector: Kₖ = Pₖ₋₁ φₖ / (λ + φₖᵀ Pₖ₋₁ φₖ)
Covariance update: Pₖ = (Pₖ₋₁ − Kₖ φₖᵀ Pₖ₋₁) / λ
Where θ is the coefficient vector, φ is the predictor vector, λ is the forgetting factor, P is the covariance matrix, and the subscript k indexes observations.
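The four update equations above can be sketched directly in code. This is a minimal NumPy illustration, not the calculator's internal implementation; the function name and argument shapes are assumptions:

```python
import numpy as np

def rls_update(theta, P, phi, y, lam=1.0):
    """One recursive least squares step; returns updated (theta, P) and the error.

    theta : (n,) current coefficient vector
    P     : (n, n) current covariance matrix
    phi   : (n,) predictor vector for the new observation
    y     : observed target value
    lam   : forgetting factor, 0 < lam <= 1
    """
    e = y - phi @ theta                    # prediction error before the update
    K = P @ phi / (lam + phi @ P @ phi)    # gain vector
    theta = theta + K * e                  # coefficient update
    P = (P - np.outer(K, phi) @ P) / lam   # covariance update
    return theta, P, e

# Example: three noiseless observations of y = 2*x with a single predictor.
theta, P = np.zeros(1), np.eye(1) * 1000.0
for x in [1.0, 2.0, 3.0]:
    theta, P, err = rls_update(theta, P, np.array([x]), 2.0 * x)
print(theta)   # approaches [2.0]
```

Each call consumes one observation and discards it afterward, which is what keeps the method suitable for streamed data.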
Recursive least squares is a powerful method for adaptive regression: it updates coefficients one observation at a time, which makes it useful when data arrives sequentially or when model relationships drift over time.
A standard least squares fit requires the full dataset before it can be computed, whereas recursive least squares updates the estimate after every new row and avoids repeated full recalculation. That makes it efficient for streaming systems, monitoring tasks, forecasting pipelines, and online estimation problems.
This calculator estimates regression coefficients with the recursive least squares method. You can enter multiple predictors, choose an intercept, set a forgetting factor, and define the initial covariance scale. The tool reports coefficients, residual measures, prediction quality, and stepwise updates. It also produces a next-point forecast when you provide predictor values.
The results section shows the final coefficient vector, fitted equation, sample count, error totals, mean squared error, root mean squared error, mean absolute error, and coefficient of determination. A detailed update table also shows each observation, the predicted value before updating, the residual, and the evolving coefficients.
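The error summaries in that results section are standard regression metrics. A small sketch of how they are computed (the function name and return structure are my own, not the calculator's):

```python
import numpy as np

def regression_metrics(y_true, y_pred):
    """Standard fit metrics: SSE, MSE, RMSE, MAE, and R-squared."""
    y_true = np.asarray(y_true, dtype=float)
    resid = y_true - np.asarray(y_pred, dtype=float)
    sse = float(resid @ resid)                      # error total (sum of squares)
    mse = sse / resid.size                          # mean squared error
    tss = float(np.sum((y_true - y_true.mean()) ** 2))
    return {
        "SSE": sse,
        "MSE": mse,
        "RMSE": mse ** 0.5,                         # root mean squared error
        "MAE": float(np.mean(np.abs(resid))),       # mean absolute error
        "R2": 1.0 - sse / tss,                      # coefficient of determination
    }
```

For a perfect fit the residuals are all zero, so MSE is 0 and R² is 1; R² falls toward 0 as the model explains less of the variance than the mean does.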
Use recursive least squares when you want adaptive estimation. It is helpful for sensor calibration, financial modeling, demand tracking, industrial control, quality monitoring, and time-varying statistical relationships. The forgetting factor lets newer observations carry more influence than older ones.
Start with clean predictor names and well-structured observation rows. Keep scales reasonable when variables differ strongly. Choose a forgetting factor near one for stable systems. Use a smaller value when the process changes faster. Review residuals and coefficient paths before trusting forecasts. Export the results for reporting, comparison, or audit work.
Each step uses the current predictor vector, the previous covariance matrix, and the previous coefficient estimate. The gain vector controls how strongly the new observation changes the model: large uncertainty gives a larger update, and as uncertainty falls, coefficient movement becomes smaller. That behavior makes the method stable, fast, and suitable for repeated operational use. Test with the example data before switching to live operational inputs.
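To illustrate these steps end to end, the loop below runs recursive least squares over the sample table from the top of the page, with an intercept, λ = 1, and a large initial covariance (that combination makes the result approach an ordinary least squares fit). Variable names and the covariance scale are illustrative assumptions:

```python
import numpy as np

# Sample observations from the table above: predictors X1, X2 and target Y.
X = np.array([[1, 2], [2, 1], [3, 2], [4, 3], [5, 4], [6, 5]], dtype=float)
y = np.array([2.6, 4.8, 5.9, 6.8, 7.9, 8.4])

n = X.shape[1] + 1                 # two predictors plus an intercept
theta = np.zeros(n)                # start from zero coefficients
P = np.eye(n) * 1e6                # large initial covariance: flexible early updates
lam = 1.0                          # forgetting factor of 1: full memory

for row, target in zip(X, y):
    phi = np.concatenate(([1.0], row))       # intercept term first
    e = target - phi @ theta                 # prediction error before the update
    K = P @ phi / (lam + phi @ P @ phi)      # gain vector
    theta = theta + K * e                    # coefficient update
    P = (P - np.outer(K, phi) @ P) / lam     # covariance update

print(theta)   # [intercept, coefficient on X1, coefficient on X2]
```

Printing `theta` and `e` inside the loop reproduces the kind of stepwise update table the calculator shows: the prediction before each update, the residual, and the evolving coefficients.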
It estimates regression coefficients by updating them after each new observation. This helps when you want online learning or adaptive fitting without recalculating a full batch regression every time.
The forgetting factor controls how much past data influences new estimates. Values close to 1 keep longer memory. Smaller values react faster to changing patterns.
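To see that memory effect, here is a one-predictor sketch where the true slope drifts from 1 to 3 midway through the stream. The data, starting covariance, and function name are all illustrative:

```python
import numpy as np

def rls_slope(xs, ys, lam):
    """Scalar RLS (single predictor, no intercept); returns the final slope."""
    theta, P = 0.0, 1000.0
    for x, y in zip(xs, ys):
        e = y - x * theta
        K = P * x / (lam + x * P * x)
        theta += K * e
        P = (P - K * x * P) / lam
    return theta

xs = np.tile(np.arange(1.0, 11.0), 10)              # 100 observations
slopes = np.where(np.arange(100) < 50, 1.0, 3.0)    # true slope drifts 1 -> 3
ys = slopes * xs

print(rls_slope(xs, ys, 1.0))   # long memory: settles between the two slopes
print(rls_slope(xs, ys, 0.9))   # shorter memory: tracks the recent slope near 3
```

With λ = 1 every observation counts equally, so the estimate lands near the average of the two regimes; with λ = 0.9 old rows decay quickly and the estimate follows the recent slope.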
The initial covariance scale affects early flexibility. Larger values allow bigger first updates. Smaller values make the model more conservative at the start.
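A quick scalar sketch of that early-flexibility effect, with illustrative values: after only three observations of y = 2·x, a small initial covariance keeps the estimate near its starting value of zero, while a large one lets the data dominate immediately.

```python
def rls_slope_p0(xs, ys, p0, lam=1.0):
    """Scalar RLS starting from theta = 0 with initial covariance p0."""
    theta, P = 0.0, p0
    for x, y in zip(xs, ys):
        e = y - x * theta
        K = P * x / (lam + x * P * x)
        theta += K * e
        P = (P - K * x * P) / lam
    return theta

xs = [1.0, 2.0, 3.0]                 # three observations of y = 2*x
ys = [2.0, 4.0, 6.0]

print(rls_slope_p0(xs, ys, 0.01))    # conservative start: still well below 2
print(rls_slope_p0(xs, ys, 1000.0))  # flexible start: essentially 2 already
```

The conservative run eventually converges too; it just needs more observations to overcome the strong initial confidence in the starting coefficients.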
Yes. Enter predictor names in order and provide each observation row with the same number of predictor values. Put the target value last on every line.
Usually yes, unless theory says the response must pass through zero when all predictors are zero. The checkbox adds that constant term automatically.
The gain vector shows how strongly the current observation adjusts each coefficient. Bigger gain values mean the new row has more immediate influence on the update.
The prediction error is actual minus predicted response before the update. Positive error means the model predicted too low. Negative error means it predicted too high.
It is better when data arrives one row at a time, when relationships drift, or when repeated full refits would be slow for operational work.
Important Note: All the calculators listed on this site are for educational purposes only and we do not guarantee the accuracy of results. Please consult other sources as well.