Simple Convex Optimization Solver Calculator

Enter coefficients, bounds, and tolerances for accurate analysis. Review optimum values, gradients, residuals, and iterations. Download tables and printable summaries for study or reporting.

Calculator Input

Example Data Table

Field              Example Value                         Purpose
a                  4                                     Quadratic weight on x²
b                  6                                     Quadratic weight on y²
c                  1                                     Cross interaction term xy
d                  -8                                    Linear tilt for x
e                  -10                                   Linear tilt for y
f₀                 0                                     Constant offset
x bounds           0 to 5                                Allowed x interval
y bounds           0 to 5                                Allowed y interval
Start point        (0, 0)                                Initial guess
Learning rate      0.10                                  Projected descent step
Max iterations     250                                   Iteration cap
Tolerance          0.000001                              Stopping rule
Expected optimum   Approximately (1.652174, 1.391304)    Reference solution
Expected minimum   Approximately -13.565217              Reference objective

Formula Used

This calculator solves a box-constrained convex quadratic minimization problem.

Objective function:

f(x, y) = 0.5ax² + 0.5by² + cxy + dx + ey + f₀

Gradient:

∇f(x, y) = [ax + cy + d, cx + by + e]

Projected update:

xₖ₊₁ = clip(xₖ - α(a·xₖ + c·yₖ + d), x_min, x_max)

yₖ₊₁ = clip(yₖ - α(c·xₖ + b·yₖ + e), y_min, y_max)

Convexity check:

a ≥ 0, b ≥ 0, and ab - c² ≥ 0

The clip step keeps each update inside the chosen bounds. The projected gradient norm helps measure how close the current point is to the constrained optimum.
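The update rules above can be sketched as a short Python routine. This is a minimal illustration of the formulas, not the calculator's own code; the function and variable names are chosen here for clarity.

```python
def clip(v, lo, hi):
    """Project a scalar onto the interval [lo, hi]."""
    return max(lo, min(hi, v))

def solve(a, b, c, d, e, f0, x_bounds, y_bounds, start,
          alpha=0.10, max_iter=250, tol=1e-6):
    """Projected gradient descent for
    f(x, y) = 0.5*a*x^2 + 0.5*b*y^2 + c*x*y + d*x + e*y + f0."""
    x, y = start
    for _ in range(max_iter):
        gx = a * x + c * y + d               # df/dx
        gy = c * x + b * y + e               # df/dy
        nx = clip(x - alpha * gx, *x_bounds)
        ny = clip(y - alpha * gy, *y_bounds)
        done = abs(nx - x) < tol and abs(ny - y) < tol
        x, y = nx, ny
        if done:                             # stop when the step is tiny
            break
    f = 0.5 * a * x * x + 0.5 * b * y * y + c * x * y + d * x + e * y + f0
    return x, y, f

x, y, f = solve(4, 6, 1, -8, -10, 0, (0, 5), (0, 5), (0, 0))
# converges near (1.652174, 1.391304) with f close to -13.565217
```

Running it with the example data from the table reproduces the expected optimum and minimum within the stated tolerance.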

How to Use This Calculator

  1. Enter the quadratic coefficients a, b, and c.
  2. Enter the linear coefficients d and e, plus the constant term f₀.
  3. Set lower and upper bounds for x and y.
  4. Choose a starting point inside the bounds.
  5. Set a learning rate, maximum iterations, and tolerance.
  6. Click the solve button to run projected gradient descent.
  7. Review the optimum point, minimum value, gradient norms, and iteration history.
  8. Use the CSV and PDF buttons to save the generated report.

About This Simple Convex Optimization Solver Calculator

Why convex optimization matters

Simple convex optimization is a core topic in maths. It helps you find the smallest value of a well-structured function. This calculator focuses on a convex quadratic model with box constraints. That setup is common in optimization classes, numerical analysis, and applied modelling.

What this calculator solves

The page minimizes a two-variable quadratic objective. The function includes squared terms, a cross term, linear terms, and a constant. The solver also respects lower and upper bounds for both variables. Those bounds make the tool useful for constrained minimization practice.

How the method works

The calculator uses projected gradient descent. First, it computes the gradient at the current point. Next, it takes a step in the direction opposite the gradient. Then it clips the new point so it stays inside the selected bounds. This process repeats until the step size and projected gradient norm fall below the tolerance, or until the iteration limit is reached.

Why convexity is checked first

A convex quadratic objective has an important property: any local minimum is also a global minimum. That makes the result easy to interpret. The tool checks convexity through the Hessian conditions a ≥ 0, b ≥ 0, and ab - c² ≥ 0. If those conditions fail, the page shows a warning instead of treating a non-convex problem as convex.
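The Hessian test described above is simple enough to express in a few lines. This is an illustrative sketch with hypothetical names, not the page's own validation code:

```python
def is_convex(a, b, c):
    # The Hessian of the objective is [[a, c], [c, b]]; it is positive
    # semidefinite exactly when a >= 0, b >= 0 and ab - c^2 >= 0.
    return a >= 0 and b >= 0 and a * b - c * c >= 0

print(is_convex(4, 6, 1))   # example data: convex
print(is_convex(1, 1, 2))   # cross term too strong: not convex
```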

What the output tells you

The result section reports the estimated optimum x and y values. It also shows the minimum objective value, final gradient, projected gradient norm, determinant, and feasibility against the bounds. The iteration table makes the search path easier to study. This is helpful for learning convergence behaviour step by step.

Who can use it

This simple convex optimization solver calculator is useful for students, teachers, analysts, and anyone reviewing constrained quadratic models. It supports classroom examples, self-study, homework checks, and method comparisons. The export options also make it easy to save reports for later review or documentation.

Frequently Asked Questions

1. What type of problem does this calculator solve?

It solves a two-variable convex quadratic minimization problem with lower and upper bounds on x and y. The method is numerical and uses projected gradient descent.

2. Why does the calculator require convex coefficients?

Convexity ensures the minimum is globally reliable for this model. Without convexity, projected gradient descent may not represent the intended convex optimization problem.

3. What does the cross coefficient c do?

The c term links x and y through the xy product. It changes the curvature and can rotate the shape of the quadratic surface.

4. Why are bounds useful in optimization?

Bounds restrict the search region. They model practical limits and keep the solution inside an allowed interval for each decision variable.

5. What is the projected gradient norm?

It measures how close the current bounded point is to a constrained optimum. Smaller values usually indicate a better stopping point.
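One common way to compute this quantity, assuming a unit-step fixed-point residual (an assumption here; the page does not state its exact formula), is:

```python
def projected_gradient_norm(point, grad, bounds):
    # Fixed-point residual with a unit step: each component is
    # v - clip(v - g). It vanishes when the gradient is zero in the
    # interior, or when it only pushes against an active bound.
    total = 0.0
    for v, g, (lo, hi) in zip(point, grad, bounds):
        r = v - max(lo, min(hi, v - g))
        total += r * r
    return total ** 0.5

box = ((0, 5), (0, 5))
print(projected_gradient_norm((0, 0), (1.0, 1.0), box))   # 0.0: bounds are active
print(projected_gradient_norm((2, 2), (0.3, 0.4), box))   # about 0.5: interior point
```

At a constrained optimum on the boundary, the residual is zero even though the raw gradient is not, which is why this norm is a better stopping signal than the plain gradient norm.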

6. How should I choose the learning rate?

Start with a small positive value such as 0.05 or 0.10. If progress is slow, increase it carefully. If results oscillate, reduce it.
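A more systematic starting point, which the page itself does not use, is the standard stability bound for gradient descent on a quadratic: the step is stable for α < 2/L, where L is the largest eigenvalue of the Hessian. A conservative default of α = 1/L can be computed directly:

```python
import math

def suggested_learning_rate(a, b, c):
    # Largest eigenvalue L of the Hessian [[a, c], [c, b]], from the
    # closed-form 2x2 eigenvalue formula; alpha = 1/L is a safe default.
    L = 0.5 * (a + b) + math.sqrt(0.25 * (a - b) ** 2 + c * c)
    return 1.0 / L

print(round(suggested_learning_rate(4, 6, 1), 4))  # about 0.1559 for the example data
```

For the example coefficients this gives roughly 0.156, which is consistent with the 0.10 default working well.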

7. Why might the solver stop at the iteration limit?

The step size may be too small, the tolerance may be strict, or the learning rate may be unsuitable. Adjust those settings and solve again.

8. Can I use this for teaching and coursework?

Yes. The calculator is well suited for maths practice, optimization demonstrations, convergence checks, and simple constrained quadratic examples.

Related Calculators

utility maximization calculator · deep learning optimizer · simplex method solver · gradient descent calculator · lasso regression solver · interior point solver · convex set projection · feasible region finder · trajectory optimization tool · convex combination calculator

Important Note: All the calculators listed on this site are for educational purposes only, and we do not guarantee the accuracy of the results. Please consult other sources as well.