Advanced Gradient Descent Calculator

Explore convergence paths, gradients, and step sizes for two-variable quadratic functions. Inspect every update, export the results, and refine your optimization settings with practical confidence.

Gradient Descent Input Form

  - a: coefficient for x².
  - b: coefficient for y².
  - c: coefficient for xy.
  - d: coefficient for x.
  - e: coefficient for y.
  - f: constant term; adds a vertical shift only.
  - Starting x: starting point for x.
  - Starting y: starting point for y.
  - Learning rate: step multiplier for each update.
  - Tolerance: the run converges when the gradient norm is small enough.
  - Maximum iterations: hard stop for the optimization loop.
  - Update method: choose the update rule.
  - Momentum coefficient: used by the momentum and Nesterov methods.
  - Schedule: controls how the learning rate changes over time.
  - Decay factor: used by the time, step, and exponential schedules.
  - Step interval: interval for the step-decay schedule.
  - Clipping limit: set to zero to disable gradient clipping.

Example Data Table

| Scenario            | Objective Function             | Start Point | Method   | Learning Rate | Approximate Minimum | Approximate Value |
|---------------------|--------------------------------|-------------|----------|---------------|---------------------|-------------------|
| Convex baseline     | 3x² + 2y² + xy - 8x + 6y + 4   | (4, -4)     | Standard | 0.08          | (1.6522, -1.9130)   | -8.3478           |
| Momentum tuned      | 4x² + 3y² + 0.5xy - 10x - 4y + 2 | (5, 1)    | Momentum | 0.05          | (1.2340, 0.5638)    | -4.3122           |
| Nesterov with decay | 2x² + 5y² - xy + 7x - 3y + 1   | (-3, 2)     | Nesterov | 0.04          | (-1.6327, 0.2735)   | -4.9951           |

Formula Used

Objective function:
  f(x, y) = ax² + by² + cxy + dx + ey + f

Gradient components:
  ∂f/∂x = 2ax + cy + d
  ∂f/∂y = 2by + cx + e

Standard update:
  xₖ₊₁ = xₖ - ηₖ(∂f/∂x), yₖ₊₁ = yₖ - ηₖ(∂f/∂y)

Momentum update:
  vₖ₊₁ = βvₖ + ηₖ∇f(xₖ, yₖ), pₖ₊₁ = pₖ - vₖ₊₁

Nesterov update:
  Evaluate the gradient at pₖ - βvₖ, then update with the new velocity.

Learning rate schedules:
  Time decay: ηₖ = η₀ / (1 + decay·k)
  Step decay: ηₖ = η₀ · decay^floor(k / step)
  Exponential decay: ηₖ = η₀ · e^(-decay·k)
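The gradient components and the three decay schedules above can be sketched directly in Python. The function names and the `schedule` keyword values here are illustrative choices, not part of the calculator itself:

```python
import math

def gradient(a, b, c, d, e, x, y):
    """Gradient of f(x, y) = ax^2 + by^2 + cxy + dx + ey + f."""
    return (2 * a * x + c * y + d, 2 * b * y + c * x + e)

def learning_rate(eta0, decay, k, schedule="time", step=10):
    """Learning rate at iteration k under the three schedules above."""
    if schedule == "time":
        return eta0 / (1 + decay * k)          # time decay
    if schedule == "step":
        return eta0 * decay ** math.floor(k / step)  # step decay
    if schedule == "exp":
        return eta0 * math.exp(-decay * k)     # exponential decay
    return eta0  # constant rate when no schedule is selected
```

For the convex baseline coefficients (a=3, b=2, c=1, d=-8, e=6) at the start point (4, -4), the gradient is (12, -6), which is why that run initially moves down and to the left.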

Gradient clipping rescales the gradient whenever its norm exceeds your selected limit. This can stabilize difficult runs with aggressive learning rates or steep contours.
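A minimal sketch of that rescaling, assuming the form's convention that a zero limit disables clipping:

```python
import math

def clip_gradient(gx, gy, max_norm):
    """Rescale (gx, gy) so its Euclidean norm never exceeds max_norm.

    A max_norm of zero (or less) disables clipping entirely."""
    if max_norm <= 0:
        return gx, gy
    norm = math.hypot(gx, gy)
    if norm > max_norm:
        scale = max_norm / norm
        return gx * scale, gy * scale
    return gx, gy
```

Because only the magnitude is rescaled, the update still points in the original descent direction; the step is just shorter.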

How to Use This Calculator

  1. Enter the quadratic coefficients for x², y², xy, x, y, and the constant term.
  2. Choose an initial point that represents your starting guess.
  3. Set the base learning rate, tolerance, and maximum iterations.
  4. Select the update method: standard, momentum, or Nesterov.
  5. Pick a schedule if you want the learning rate to shrink over time.
  6. Use gradient clipping when large gradients might cause unstable jumps.
  7. Press the calculate button and review the result panel above the form.
  8. Download the iteration log as CSV or export the report as PDF.
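The steps above, for the standard update method without a schedule or clipping, can be sketched as a single loop. The function name and signature are illustrative, not the calculator's internals:

```python
import math

def gradient_descent(a, b, c, d, e, f0, x, y, eta=0.08, tol=1e-6, max_iter=10000):
    """Standard gradient descent on f(x, y) = ax^2 + by^2 + cxy + dx + ey + f0."""
    for k in range(max_iter):
        gx = 2 * a * x + c * y + d
        gy = 2 * b * y + c * x + e
        if math.hypot(gx, gy) < tol:  # stopping rule: gradient norm below tolerance
            break
        x -= eta * gx
        y -= eta * gy
    value = a * x * x + b * y * y + c * x * y + d * x + e * y + f0
    return x, y, value, k
```

Run on the "Convex baseline" row of the example table (start point (4, -4), rate 0.08), it settles near (1.6522, -1.9130) with value about -8.3478, matching the table.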

Frequently Asked Questions

1. What does this calculator optimize?

It minimizes a two-variable quadratic function. You control the coefficients, starting point, learning settings, stopping rule, update method, and optional gradient clipping.

2. Why can a run diverge?

Divergence usually happens when the learning rate is too large, the surface is poorly conditioned, or momentum pushes the path too far. Lower the rate or enable clipping.

3. When should I use momentum?

Momentum helps when gradients oscillate across narrow valleys. It smooths updates and can reduce the number of iterations needed to approach the minimum.

4. What is Nesterov acceleration?

Nesterov acceleration looks ahead before measuring the gradient. This often provides a more informed correction than ordinary momentum, especially on curved surfaces.
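One Nesterov step, using the velocity convention from the formula section (gradient evaluated at the lookahead point pₖ - βvₖ), could be sketched as follows; `grad` here is any user-supplied gradient function:

```python
def nesterov_step(x, y, vx, vy, eta, beta, grad):
    """One Nesterov update: evaluate grad at the lookahead point
    (x - beta*vx, y - beta*vy), then update velocity and position."""
    gx, gy = grad(x - beta * vx, y - beta * vy)
    vx = beta * vx + eta * gx   # v_{k+1} = beta*v_k + eta*grad
    vy = beta * vy + eta * gy
    return x - vx, y - vy, vx, vy
```

With zero initial velocity the first step matches plain gradient descent; the lookahead only changes behavior once the velocity is nonzero.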

5. How does tolerance affect convergence?

Tolerance sets the gradient norm threshold for stopping. Smaller values demand a more precise solution but may require more iterations and better tuning.

6. What does the analytical benchmark show?

It solves the stationary system directly when the Hessian determinant is nonzero. You can compare the iterative answer with the closed-form reference point.
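Setting both gradient components to zero gives the linear system 2ax + cy = -d and cx + 2by = -e, which Cramer's rule solves directly when the Hessian determinant 4ab - c² is nonzero. A minimal sketch:

```python
def analytic_minimum(a, b, c, d, e):
    """Solve the stationary system 2a*x + c*y = -d, c*x + 2b*y = -e.

    Returns None when the Hessian determinant 4ab - c^2 is zero."""
    det = 4 * a * b - c * c
    if det == 0:
        return None
    x = (c * e - 2 * b * d) / det  # Cramer's rule, first column replaced
    y = (c * d - 2 * a * e) / det  # Cramer's rule, second column replaced
    return x, y
```

For the convex baseline this returns (38/23, -44/23) ≈ (1.6522, -1.9130), the closed-form reference the iterative run is compared against.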

7. When is the quadratic strictly convex?

For this two-variable model, strict convexity occurs when 2a is positive and the Hessian determinant 4ab - c² is also positive.
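That condition is a one-line check; the function name is illustrative:

```python
def is_strictly_convex(a, b, c):
    """Strict convexity test for f(x, y) = ax^2 + by^2 + cxy + ...:
    2a > 0 and Hessian determinant 4ab - c^2 > 0."""
    return 2 * a > 0 and 4 * a * b - c * c > 0
```

All three example scenarios pass this test, which is why each has a unique minimum that gradient descent can reach.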

8. What do the CSV and PDF exports include?

The exports include the summary metrics and the iteration history, making it easier to document runs, compare settings, and share optimization results.

Related Calculators

  - Utility Maximization Calculator
  - Feasible Region Finder
  - Learning Rate Optimizer

Important Note: All the calculators listed on this site are for educational purposes only, and we do not guarantee the accuracy of results. Please consult other sources as well.