Unconstrained Optimization Calculator

Solve two-variable quadratic models using gradients, Hessians, and iterative updates. Trace optimization paths and classify stationary points. Export results for further analysis and reporting.

Calculator Inputs

Optimize the quadratic function f(x,y) = ax² + by² + cxy + dx + ey + f using gradient descent, momentum, or Newton updates.

Example Data Table

| Function | Start Point | Method | Expected Stationary Point | Classification |
| --- | --- | --- | --- | --- |
| f(x,y) = x² + y² - 6x + 4y + 5 | (8, -8) | Gradient Descent | (3, -2) | Local minimum |
| Set a=1, b=1, c=0, d=-6, e=4, f=5 | Use α=0.10, tol=1e-6 | Newton Method | Converges in one damped step with damping 1 | Strictly convex surface |

Formula Used

This calculator works with the quadratic objective f(x,y) = ax² + by² + cxy + dx + ey + f.

The gradient is ∇f(x,y) = [2ax + cy + d, 2by + cx + e].

The Hessian matrix is H = [[2a, c], [c, 2b]]. It stays constant for quadratic objectives.

The stationary point solves H [x, y]^T = -[d, e]^T, provided det(H) ≠ 0.
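The linear solve above can be sketched in a few lines. This is a minimal illustration, not the page's actual implementation, using the coefficients from the example table (a=1, b=1, c=0, d=-6, e=4):

```python
import numpy as np

# Coefficients from the worked example: f(x,y) = x^2 + y^2 - 6x + 4y + 5
a, b, c, d, e = 1.0, 1.0, 0.0, -6.0, 4.0

H = np.array([[2*a, c], [c, 2*b]])   # constant Hessian of the quadratic
g0 = np.array([d, e])                # linear-term vector [d, e]

# Stationary point: solve H [x, y]^T = -[d, e]^T, valid when det(H) != 0
if abs(np.linalg.det(H)) > 1e-12:
    x_star = np.linalg.solve(H, -g0)
    print(x_star)  # [ 3. -2.]
```

For the example function this recovers the expected stationary point (3, -2).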

The second-order classification uses the Hessian determinant: if det(H) > 0 and 2a > 0, the point is a local minimum. If det(H) > 0 and 2a < 0, it is a local maximum. If det(H) < 0, the point is a saddle point. If det(H) = 0, the test is inconclusive: the quadratic is degenerate along some direction.
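The determinant test translates directly into code. A small sketch (assuming the coefficient naming used above):

```python
import numpy as np

def classify(a, b, c):
    """Classify the stationary point of f = ax^2 + by^2 + cxy + dx + ey + f
    using the determinant and leading entry of the constant Hessian."""
    H = np.array([[2*a, c], [c, 2*b]])
    det = np.linalg.det(H)
    if det > 0 and H[0, 0] > 0:
        return "local minimum"
    if det > 0 and H[0, 0] < 0:
        return "local maximum"
    if det < 0:
        return "saddle point"
    return "degenerate (det = 0)"

print(classify(1, 1, 0))    # local minimum
print(classify(-1, -1, 0))  # local maximum
print(classify(1, -1, 0))   # saddle point
```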

Update rules (α is the learning rate, β the momentum coefficient, λ the damping factor):

Gradient descent: xₖ₊₁ = xₖ − α ∇f(xₖ).

Momentum: vₖ₊₁ = β vₖ − α ∇f(xₖ), then xₖ₊₁ = xₖ + vₖ₊₁.

Damped Newton: xₖ₊₁ = xₖ − λ H⁻¹ ∇f(xₖ).
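A minimal sketch of the three methods the calculator offers (gradient descent, momentum, damped Newton), using the example coefficients a=1, b=1, c=0, d=-6, e=4. The parameter defaults below are illustrative, not the page's actual settings:

```python
import numpy as np

a, b, c, d, e = 1.0, 1.0, 0.0, -6.0, 4.0
H = np.array([[2*a, c], [c, 2*b]])
grad = lambda p: H @ p + np.array([d, e])   # gradient of the quadratic

def gradient_descent(p, alpha=0.10, tol=1e-6, max_iter=2000):
    for _ in range(max_iter):
        g = grad(p)
        if np.linalg.norm(g) < tol:
            break
        p = p - alpha * g                   # x_{k+1} = x_k - alpha * grad
    return p

def momentum(p, alpha=0.10, beta=0.9, tol=1e-6, max_iter=2000):
    v = np.zeros(2)
    for _ in range(max_iter):
        g = grad(p)
        if np.linalg.norm(g) < tol:
            break
        v = beta * v - alpha * g            # v_{k+1} = beta * v_k - alpha * grad
        p = p + v
    return p

def newton(p, damping=1.0):
    # One damped Newton step; exact for a quadratic with invertible H
    return p - damping * np.linalg.solve(H, grad(p))

start = np.array([8.0, -8.0])
print(gradient_descent(start))  # ≈ [ 3. -2.]
print(newton(start))            # [ 3. -2.] in a single step
```

Because the objective is quadratic, a full Newton step (damping 1) lands on the stationary point exactly, matching the example table.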

How to Use This Calculator

  1. Enter the quadratic coefficients for the objective function.
  2. Choose a starting point for x and y.
  3. Select an optimization method.
  4. Set learning rate, momentum, damping, tolerance, and iteration limit.
  5. Press Calculate Optimization.
  6. Review the final point, gradient norm, Hessian metrics, and classification.
  7. Inspect the charts to understand convergence behavior and the optimization path.
  8. Download the iteration log as CSV or export the result area as PDF.

FAQs

1) What does this calculator optimize?

It optimizes a two-variable quadratic objective function. The page computes gradients, Hessian properties, stationary points, convergence history, and the final numerical estimate from your chosen method.

2) Why is the Hessian important?

The Hessian measures curvature. Its determinant, trace, and eigenvalues help classify the stationary point and reveal whether the surface is convex, concave, saddle-shaped, or ill-conditioned.

3) When should I use gradient descent?

Use gradient descent when you want a simple, controlled iterative path. It is easy to tune, but it may require many iterations when curvature differs sharply between directions, producing an elongated, ill-conditioned surface.

4) What is momentum doing here?

Momentum carries part of the previous step forward. This can reduce zigzagging and often speeds movement across narrow valleys, especially when the objective is elongated.

5) Why can Newton method be faster?

Newton uses second-order curvature information. For a well-behaved quadratic with an invertible Hessian, it can reach the stationary point in very few steps, sometimes one.

6) What does condition number mean?

The condition number estimates sensitivity. A large value means the surface is stretched or nearly singular, which can make iterative progress slower and numerical updates less stable.
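Since the Hessian is constant for a quadratic, its condition number can be computed directly. A quick sketch (the coefficient values are illustrative):

```python
import numpy as np

def hessian_cond(a, b, c):
    """Condition number of the constant Hessian H = [[2a, c], [c, 2b]]:
    the ratio of its largest to smallest singular value."""
    H = np.array([[2*a, c], [c, 2*b]])
    return np.linalg.cond(H)

print(hessian_cond(1, 1, 0))     # 1.0  -> round contours, easy to optimize
print(hessian_cond(50, 0.5, 0))  # 100.0 -> stretched valley, slow descent
```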

7) Why would the calculator report divergence?

Divergence usually means the learning rate is too large, the Hessian is singular for Newton updates, or the chosen method is unstable for the supplied curvature and starting point.
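The learning-rate failure mode is easy to reproduce. For a quadratic, gradient descent diverges once α exceeds 2/L, where L is the largest Hessian eigenvalue (L = 2 for the example function, so the threshold is α = 1). A hypothetical check:

```python
import numpy as np

# Gradient descent on the example f = x^2 + y^2 - 6x + 4y + 5 (Hessian 2I)
H = np.array([[2.0, 0.0], [0.0, 2.0]])
grad = lambda p: H @ p + np.array([-6.0, 4.0])

def residual_after(alpha, steps=50):
    """Run a fixed number of gradient steps and report the gradient norm."""
    p = np.array([8.0, -8.0])
    for _ in range(steps):
        p = p - alpha * grad(p)
    return np.linalg.norm(grad(p))

print(residual_after(0.10))  # tiny residual: converging
print(residual_after(1.10))  # exploding residual: alpha > 2/L, diverging
```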

8) Can this page solve non-quadratic objectives?

No. This implementation is designed for quadratic unconstrained optimization in two variables. That restriction allows exact Hessian analysis, stationary classification, and a clear contour plot.

Related Calculators

goal programming calculator
nonlinear least squares calculator
calculus of variations solver
profit optimization calculator
dynamic optimization calculator
step size calculator
revenue maximization calculator
simulated annealing calculator

Important Note: All the calculators listed on this site are for educational purposes only, and we do not guarantee the accuracy of results. Please consult other sources as well.