Solver Inputs
Enter a quadratic model in the form f(x, y) = ax² + by² + cxy + dx + ey + k, choose bounds, set a starting point, and run the solver.
Formula Used
The solver uses the quadratic objective f(x, y) = ax² + by² + cxy + dx + ey + k.
Its gradient is ∇f(x, y) = [2ax + cy + d, 2by + cx + e]. For minimization, the step moves opposite the gradient. For maximization, it moves with the gradient.
The iterative update is vₜ = m·vₜ₋₁ + s·η·∇f(x, y), where m is the momentum coefficient, η is the learning rate, and s is -1 for minimization or +1 for maximization.
The new point becomes (x, y) = projection[(x, y) + vₜ], which clamps each coordinate to the user-defined bounds.
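The projected momentum update above can be sketched in a few lines (a hypothetical implementation; function names, defaults, and bounds here are illustrative, not the tool's actual source):

```python
def grad(a, b, c, d, e, x, y):
    """Gradient of f(x, y) = ax² + by² + cxy + dx + ey + k (k drops out)."""
    return 2*a*x + c*y + d, 2*b*y + c*x + e

def clamp(v, lo, hi):
    return max(lo, min(hi, v))

def step(coeffs, x, y, vx, vy, lr=0.08, momentum=0.2, s=-1,
         bounds=((-5.0, 5.0), (-5.0, 5.0))):
    """One projected momentum update: v_t = m*v_{t-1} + s*lr*grad(f),
    then clamp the new point to the box. s = -1 minimizes, s = +1 maximizes."""
    a, b, c, d, e = coeffs
    gx, gy = grad(a, b, c, d, e, x, y)
    vx = momentum * vx + s * lr * gx
    vy = momentum * vy + s * lr * gy
    (xlo, xhi), (ylo, yhi) = bounds
    return clamp(x + vx, xlo, xhi), clamp(y + vy, ylo, yhi), vx, vy
```

With the coefficients from the example table, starting at (4, -3) with zero velocity, a single step lands at (1.76, -1.24): the gradient there is (28, -22), so the first velocity is (-2.24, 1.76).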
The exact box solution is computed separately by evaluating every analytical candidate: the four corner points, the interior stationary point (when it lies within the bounds), and the stationary point of each boundary edge (when it lies on that edge).
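The candidate enumeration can be sketched as follows (a hypothetical helper written from the description above; the tool's internals may differ):

```python
def exact_box_optimum(a, b, c, d, e, k, xlo, xhi, ylo, yhi, minimize=True):
    """Best point of f(x, y) = ax² + by² + cxy + dx + ey + k on a box,
    found by enumerating corners, the interior stationary point, and
    the stationary point of each edge."""
    def f(x, y):
        return a*x*x + b*y*y + c*x*y + d*x + e*y + k

    cands = [(x, y) for x in (xlo, xhi) for y in (ylo, yhi)]  # corners

    # Interior stationary point: solve 2a·x + c·y = -d, c·x + 2b·y = -e.
    det = 4*a*b - c*c
    if det != 0:
        x = (-2*b*d + c*e) / det
        y = (-2*a*e + c*d) / det
        if xlo <= x <= xhi and ylo <= y <= yhi:
            cands.append((x, y))

    # Edge stationary points: fix one variable, solve the 1-D quadratic.
    for x in (xlo, xhi):              # vertical edges: d f/dy = 0
        if b != 0:
            y = -(c*x + e) / (2*b)
            if ylo <= y <= yhi:
                cands.append((x, y))
    for y in (ylo, yhi):              # horizontal edges: d f/dx = 0
        if a != 0:
            x = -(c*y + d) / (2*a)
            if xlo <= x <= xhi:
                cands.append((x, y))

    key = lambda p: f(*p)
    best = min(cands, key=key) if minimize else max(cands, key=key)
    return best, f(*best)
```

For the example objective this returns (1, -0.5) with value 4.5 in minimize mode, matching the table.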
How to Use This Calculator
- Enter coefficients for the quadratic objective function.
- Set feasible lower and upper bounds for both variables.
- Choose a starting guess, learning rate, momentum, and stopping settings.
- Select minimize or maximize, then click the solver button.
- Review the exact optimum, trajectory table, convergence gap, and Plotly graph.
Example Data Table
| Mode | Objective | Bounds | Start | Learning Rate | Momentum | Exact x* | Exact y* | Exact Objective |
|---|---|---|---|---|---|---|---|---|
| Minimize | 3x² + 2y² - 4xy - 8x + 6y + 10 | x, y ∈ [-5, 5] | (4, -3) | 0.08 | 0.20 | 1.000000 | -0.500000 | 4.500000 |
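To sanity-check this row, the update rule can be iterated directly (a sketch of the method, not the tool's actual code; the 500-step budget is an assumption in place of the tool's stopping settings):

```python
def f(x, y):
    return 3*x*x + 2*y*y - 4*x*y - 8*x + 6*y + 10

x, y, vx, vy = 4.0, -3.0, 0.0, 0.0    # start (4, -3), zero initial velocity
lr, m = 0.08, 0.20                    # learning rate and momentum from the table
for _ in range(500):                  # fixed iteration budget (assumed)
    gx = 6*x - 4*y - 8                # ∂f/∂x
    gy = 4*y - 4*x + 6                # ∂f/∂y
    vx = m*vx - lr*gx                 # s = -1: minimize
    vy = m*vy - lr*gy
    x = max(-5.0, min(5.0, x + vx))   # project onto x ∈ [-5, 5]
    y = max(-5.0, min(5.0, y + vy))   # project onto y ∈ [-5, 5]
# The iterate approaches the exact optimum (1, -0.5) with objective 4.5.
```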
FAQs
1) What kind of optimization problem does this calculator solve?
This version solves two-variable quadratic objectives with rectangular bounds. It supports minimization or maximization, projected gradient updates, stationary-point checks, and exact candidate evaluation on the feasible box.
2) Why does the tool show both an exact optimum and an iterative estimate?
The exact optimum is the best point among the analytical candidates on the feasible box, so it serves as a reference answer. The iterative estimate shows how a numerical solver behaves from your chosen starting point, helping you study convergence quality and parameter sensitivity.
3) What does the gradient norm tell me?
The gradient norm measures local steepness. Small values usually mean the solver is near a stationary region. Large values mean stronger descent or ascent directions still exist.
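Concretely, the gradient norm is the Euclidean length of ∇f. At the example's exact optimum (1, -0.5) both components vanish, so the norm is exactly zero (a small illustrative check; `grad_norm` is a hypothetical helper, not part of the tool):

```python
import math

def grad_norm(gx, gy):
    """Euclidean norm of the gradient, ||∇f|| = sqrt(gx² + gy²)."""
    return math.hypot(gx, gy)

# Gradient of the example objective at its stationary point (1, -0.5):
gx = 6*1 - 4*(-0.5) - 8    # = 0
gy = 4*(-0.5) - 4*1 + 6    # = 0
```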
4) Why do bounds matter in optimization?
Bounds define feasibility. A stationary point may exist mathematically but still lie outside the allowed range. In that case, the best feasible answer often moves to a boundary or a corner.
5) How should I choose the learning rate?
Start with a moderate value like 0.01 to 0.10. If the path oscillates or hits bounds too aggressively, reduce it. If movement is extremely slow, increase it carefully.
6) What does momentum change?
Momentum smooths updates by carrying part of the previous step into the next one. It can accelerate progress on shallow regions, but excessive momentum may overshoot narrow valleys.
7) Can this calculator handle non-quadratic functions?
No. This implementation is designed specifically for quadratic objectives in two variables. For general nonlinear programming, you would need a broader solver and additional constraint handling.
8) What do the CSV and PDF downloads include?
The CSV includes the result summary and the full iterative path. The PDF provides a compact printable summary with key metrics and a short preview of trajectory rows.