
Inexact SQP Methods for Equality Constrained Optimization






Presentation Transcript


  1. INFORMS Annual Meeting 2006 Inexact SQP Methods for Equality Constrained Optimization Frank Edward Curtis Department of IE/MS, Northwestern University with Richard Byrd and Jorge Nocedal November 6, 2006

  2. Outline • Introduction • Problem formulation • Motivation for inexactness • Unconstrained optimization and nonlinear equations • Algorithm Development • Step computation • Step acceptance • Global Analysis • Merit function and sufficient decrease • Satisfying first-order conditions • Conclusions/Final remarks

  3. Outline • Introduction • Problem formulation • Motivation for inexactness • Unconstrained optimization and nonlinear equations • Algorithm Development • Step computation • Step acceptance • Global Analysis • Merit function and sufficient decrease • Satisfying first-order conditions • Conclusions/Final remarks

  4. Equality constrained optimization Goal: solve the problem. Define: the derivatives. Define: the Lagrangian. Goal: solve the KKT conditions.
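The equations on this slide were images and were lost in transcription; in standard notation, the formulation they correspond to is

```latex
\min_{x \in \mathbb{R}^n} f(x) \quad \text{subject to} \quad c(x) = 0,
\qquad f:\mathbb{R}^n \to \mathbb{R}, \;\; c:\mathbb{R}^n \to \mathbb{R}^m,
```

with derivatives and Lagrangian

```latex
g(x) = \nabla f(x), \qquad A(x) = \nabla c(x)^T, \qquad
\mathcal{L}(x,\lambda) = f(x) + \lambda^T c(x),
```

and first-order (KKT) conditions

```latex
\nabla\mathcal{L}(x,\lambda) =
\begin{bmatrix} g(x) + A(x)^T\lambda \\ c(x) \end{bmatrix} = 0 .
```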

  5. Equality constrained optimization • Two “equivalent” step computation techniques Algorithm: Newton’s method Algorithm: the SQP subproblem
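The two step computations (equations lost in transcription) are, with W_k denoting the Hessian of the Lagrangian at (x_k, λ_k): Newton's method applied to the KKT system,

```latex
\begin{bmatrix} W_k & A_k^T \\ A_k & 0 \end{bmatrix}
\begin{bmatrix} d_k \\ \delta_k \end{bmatrix}
= -\begin{bmatrix} g_k + A_k^T \lambda_k \\ c_k \end{bmatrix},
```

and the SQP subproblem

```latex
\min_{d}\; g_k^T d + \tfrac{1}{2}\, d^T W_k d
\quad \text{subject to} \quad c_k + A_k d = 0 .
```

They are "equivalent" because the linear system is exactly the first-order optimality condition of the subproblem, with δ_k the step in the multipliers.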

  6. Equality constrained optimization • Two “equivalent” step computation techniques Algorithm: Newton’s method Algorithm: the SQP subproblem • KKT matrix • Cannot be formed • Cannot be factored

  7. Equality constrained optimization • Two “equivalent” step computation techniques Algorithm: Newton’s method Algorithm: the SQP subproblem • KKT matrix • Cannot be formed • Cannot be factored • Linear system solve • Iterative method • Inexactness

  8. Unconstrained optimization Goal: minimize a nonlinear objective Algorithm: Newton’s method (CG) Note: choosing any intermediate step ensures global convergence to a local solution of NLP (Steihaug, 1983)
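The Newton-CG idea can be sketched concretely. Everything below (the test function, the forcing tolerance 0.1·‖g‖, the two-iteration CG cap) is an illustrative choice, not from the talk:

```python
# Inexact Newton-CG sketch: solve H d = -g only approximately with a few CG
# iterations; any intermediate CG iterate is still a descent direction, so a
# backtracking line search retains global convergence (Steihaug, 1983).

def f(x):      # illustrative strictly convex test function
    return x[0]**4 + (x[0] - x[1])**2 + x[1]**2

def grad(x):
    return [4*x[0]**3 + 2*(x[0] - x[1]), -2*(x[0] - x[1]) + 2*x[1]]

def hess(x):   # positive definite everywhere for this f
    return [[12*x[0]**2 + 2, -2.0], [-2.0, 4.0]]

def truncated_cg(H, b, tol, maxit):
    # conjugate gradient on H d = b, stopped early (the "inexactness")
    d, r = [0.0, 0.0], b[:]
    p, rs = r[:], b[0]**2 + b[1]**2
    for _ in range(maxit):
        Hp = [H[0][0]*p[0] + H[0][1]*p[1], H[1][0]*p[0] + H[1][1]*p[1]]
        a = rs / (p[0]*Hp[0] + p[1]*Hp[1])
        d = [d[0] + a*p[0], d[1] + a*p[1]]
        r = [r[0] - a*Hp[0], r[1] - a*Hp[1]]
        rs_new = r[0]**2 + r[1]**2
        if rs_new**0.5 <= tol:
            break
        p = [r[0] + (rs_new/rs)*p[0], r[1] + (rs_new/rs)*p[1]]
        rs = rs_new
    return d

x = [2.0, -1.0]
for _ in range(100):
    g = grad(x)
    gnorm = (g[0]**2 + g[1]**2)**0.5
    if gnorm < 1e-8:
        break
    # forcing term: linear residual only needs to be small relative to ||g||
    d = truncated_cg(hess(x), [-g[0], -g[1]], tol=0.1*gnorm, maxit=2)
    gd = g[0]*d[0] + g[1]*d[1]            # < 0: a descent direction
    t = 1.0                               # Armijo backtracking
    while f([x[0]+t*d[0], x[1]+t*d[1]]) > f(x) + 1e-4*t*gd:
        t *= 0.5
    x = [x[0]+t*d[0], x[1]+t*d[1]]
print(x)  # approaches the unique minimizer (0, 0)
```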

  9. Nonlinear equations Goal: solve a nonlinear system Algorithm: Newton’s method Note: choosing any step with and ensures global convergence (Dembo, Eisenstat, and Steihaug, 1982) (Eisenstat and Walker, 1994)
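A minimal sketch of inexact Newton for equations, using the Dembo–Eisenstat–Steihaug forcing condition ‖F + J d‖ ≤ η‖F‖; the 2×2 system, the gradient-descent inner solver, and η = 0.5 are illustrative choices, not from the talk:

```python
# Inexact Newton for F(x) = 0: accept any step d whose linear residual
# satisfies ||F(x) + J(x) d|| <= eta * ||F(x)||, eta < 1 (the forcing
# condition of Dembo-Eisenstat-Steihaug / Eisenstat-Walker).

def F(x):  # circle of radius 2 intersected with the line x0 = x1
    return [x[0]**2 + x[1]**2 - 4.0, x[0] - x[1]]

def J(x):  # Jacobian of F
    return [[2*x[0], 2*x[1]], [1.0, -1.0]]

def norm(v):
    return (v[0]**2 + v[1]**2)**0.5

def inexact_solve(A, b, tol):
    # drive ||A d - b|| below tol by gradient descent on ||A d - b||^2
    d = [0.0, 0.0]
    for _ in range(20000):
        r = [b[0] - A[0][0]*d[0] - A[0][1]*d[1],
             b[1] - A[1][0]*d[0] - A[1][1]*d[1]]
        if norm(r) <= tol:
            break
        At_r = [A[0][0]*r[0] + A[1][0]*r[1],   # A^T r, steepest descent
                A[0][1]*r[0] + A[1][1]*r[1]]
        d = [d[0] + 0.01*At_r[0], d[1] + 0.01*At_r[1]]
    return d

x, eta = [1.0, 0.5], 0.5
for _ in range(100):
    Fx = F(x)
    if norm(Fx) < 1e-10:
        break
    d = inexact_solve(J(x), [-Fx[0], -Fx[1]], tol=eta*norm(Fx))
    x = [x[0] + d[0], x[1] + d[1]]
print(x)  # approaches (sqrt(2), sqrt(2)) without ever solving J d = -F exactly
```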

  10. Outline • Introduction/Motivation • Unconstrained optimization • Nonlinear equations • Constrained optimization • Algorithm Development • Step computation • Step acceptance • Global Analysis • Merit function and sufficient decrease • Satisfying first-order conditions • Conclusions/Final remarks

  11. Equality constrained optimization • Two “equivalent” step computation techniques Algorithm: Newton’s method Algorithm: the SQP subproblem Question: can we ensure convergence to a local solution by choosing any step into the ball?

  12. Globalization strategy • Step computation: inexact SQP step • Globalization strategy: exact merit function … with Armijo line search condition
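In standard notation, the merit function and line search condition shown here (equations lost in transcription) take the form

```latex
\phi_\pi(x) = f(x) + \pi\,\|c(x)\|,
\qquad
\phi_\pi(x_k + \alpha_k d_k) \le \phi_\pi(x_k) - \eta\,\alpha_k\,\Delta m_k ,
```

where π > 0 is the penalty parameter, η ∈ (0,1), α_k is chosen by backtracking, and Δm_k is a model-reduction quantity made precise on later slides; the talk's exact constants are in the associated paper.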

  13. First attempt • Proposition: sufficiently small residual • Test: 61 problems from the CUTEr test set

  14. First attempt… not robust • Proposition: sufficiently small residual • … not enough for complete robustness • We have multiple goals (feasibility and optimality) • Lagrange multipliers may be completely off

  15. Second attempt • Step computation: inexact SQP step • Recall the line search condition • We can show …

  16. Second attempt • Step computation: inexact SQP step • Recall the line search condition • We can show ... but how negative should this be?

  17. Quadratic/linear model of merit function • Create model • Quantify reduction obtained from step

  18. Quadratic/linear model of merit function • Create model • Quantify reduction obtained from step
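A reconstruction of the lost model equations: the quadratic/linear model of the merit function at x_k, and the reduction it attributes to a step d, are

```latex
m_\pi(d) = f_k + g_k^T d + \tfrac{1}{2}\, d^T W_k d + \pi\,\|c_k + A_k d\| ,
```

```latex
\Delta m_\pi(d) = m_\pi(0) - m_\pi(d)
= -g_k^T d - \tfrac{1}{2}\, d^T W_k d
  + \pi\bigl(\|c_k\| - \|c_k + A_k d\|\bigr).
```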

  19. Exact case

  20. Exact case Exact step minimizes the objective on the linearized constraints

  21. Exact case Exact step minimizes the objective on the linearized constraints … which may lead to an increase in the objective (but that’s ok)

  22. Inexact case

  23. Option #1: current penalty parameter

  24. Option #1: current penalty parameter Step is acceptable if … for …
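The acceptance test on this slide was an equation. Schematically (a hedged reconstruction with σ ∈ (0,1) and r_k the residual of the inexact KKT solve; the precise inequalities are in the Byrd–Curtis–Nocedal paper), the step is accepted under the current penalty parameter when its model reduction is sufficiently large relative to the constraint violation:

```latex
\Delta m_{\pi_k}(d_k) \;\ge\; \sigma\,\pi_k\,
\max\bigl\{\|c_k\|,\; \|r_k\| - \|c_k\|\bigr\}.
```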

  25. Option #2: new penalty parameter

  26. Option #2: new penalty parameter Step is acceptable if … for …

  27. Option #2: new penalty parameter Step is acceptable if … for …

  28. Algorithm outline • for k = 0, 1, 2, … • Iteratively solve … • Until … • Update penalty parameter • Perform backtracking line search • Update iterate … or …
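The outline above can be sketched on a toy problem. Caveats: the problem, the penalty-parameter rule, and all tolerances below are illustrative choices, and for brevity this sketch solves the KKT system exactly by Gaussian elimination, whereas the talk's algorithm stops an iterative solver early and therefore needs the acceptance and termination tests described on the surrounding slides:

```python
import math

# Sketch of the outer SQP loop on an illustrative convex problem
# (not from the talk):  minimize exp(x0) + x0^2 + x1^2  s.t.  x0 + x1 = 1.
# The KKT solve below is exact; the inexact-SQP algorithm would truncate it.

def solve3(M, b):
    # Gaussian elimination with partial pivoting for a 3x3 system
    M = [row[:] + [bi] for row, bi in zip(M, b)]
    for col in range(3):
        piv = max(range(col, 3), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, 3):
            fac = M[r][col] / M[col][col]
            for cc in range(col, 4):
                M[r][cc] -= fac * M[col][cc]
    z = [0.0, 0.0, 0.0]
    for r in (2, 1, 0):
        z[r] = (M[r][3] - sum(M[r][cc]*z[cc] for cc in range(r+1, 3))) / M[r][r]
    return z

f = lambda x: math.exp(x[0]) + x[0]**2 + x[1]**2
c = lambda x: x[0] + x[1] - 1.0                 # one equality constraint
phi = lambda x, pi: f(x) + pi * abs(c(x))       # exact (l1) penalty merit

x, lam, pi = [3.0, -2.0], 0.0, 1.0
for _ in range(100):
    g = [math.exp(x[0]) + 2*x[0], 2*x[1]]       # gradient of f
    W = [[math.exp(x[0]) + 2.0, 0.0], [0.0, 2.0]]  # Hessian of the Lagrangian
    if math.hypot(g[0] + lam, g[1] + lam) + abs(c(x)) < 1e-10:  # KKT residual
        break
    # SQP/Newton step:  [W A^T; A 0][d; dlam] = [-(g + A^T lam); -c]
    K = [[W[0][0], W[0][1], 1.0],
         [W[1][0], W[1][1], 1.0],
         [1.0,     1.0,     0.0]]
    d0, d1, dlam = solve3(K, [-(g[0] + lam), -(g[1] + lam), -c(x)])
    lam_new = lam + dlam
    # classical penalty update: pi > |multiplier| keeps d a descent direction
    pi = max(pi, abs(lam_new) + 1.0)
    # model reduction used in the Armijo (sufficient decrease) condition
    dWd = W[0][0]*d0*d0 + 2*W[0][1]*d0*d1 + W[1][1]*d1*d1
    dm = -(g[0]*d0 + g[1]*d1) - 0.5*dWd + pi*abs(c(x))
    alpha = 1.0                                 # backtracking line search
    while phi([x[0]+alpha*d0, x[1]+alpha*d1], pi) > phi(x, pi) - 1e-4*alpha*dm:
        alpha *= 0.5
    x = [x[0] + alpha*d0, x[1] + alpha*d1]
    lam = lam_new
print(x, lam)  # x approaches roughly [0.196, 0.804]
```

The penalty rule π ≥ |λ| + 1 is the textbook choice that makes the SQP step a descent direction for the ℓ1 merit function; the talk's algorithm uses the model-reduction-based options of the preceding slides instead.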

  29. Termination test • Observe KKT conditions

  30. Outline • Introduction/Motivation • Unconstrained optimization • Nonlinear equations • Constrained optimization • Algorithm Development • Step computation • Step acceptance • Global Analysis • Merit function and sufficient decrease • Satisfying first-order conditions • Conclusions/Final remarks

  31. Assumptions • The sequence of iterates is contained in a convex set over which the following hold: • the objective function is bounded below • the objective and constraint functions and their first and second derivatives are uniformly bounded in norm • the constraint Jacobian has full row rank and its smallest singular value is bounded below by a positive constant • the Hessian of the Lagrangian is positive definite with smallest eigenvalue bounded below by a positive constant

  32. Sufficient reduction to sufficient decrease • Taylor expansion of merit function yields … • Accepted step satisfies …
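The displayed inequalities were lost in transcription; the standard argument is that, under the assumptions of slide 31, a Taylor expansion of the merit function gives

```latex
\phi_\pi(x_k + \alpha d_k) \le \phi_\pi(x_k)
  - \alpha\,\Delta m_\pi(d_k) + \gamma\,\alpha^2\,\|d_k\|^2
```

for some γ > 0 independent of k, so backtracking terminates with α_k bounded away from zero, and every accepted step satisfies

```latex
\phi_\pi(x_{k+1}) \le \phi_\pi(x_k) - \eta\,\alpha_k\,\Delta m_\pi(d_k).
```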

  33. Intermediate results • … is bounded above • … is bounded above • … is bounded below by a positive constant

  34. Sufficient decrease in merit function

  35. Step in dual space • We converge to an optimal primal solution, and (for sufficiently small … and …) Therefore, …
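The lost displays concern the multiplier iterates; a reconstruction of the standard argument: since the constraint Jacobian has full row rank and the primal iterates converge, driving the dual residual ‖g_k + A_kᵀλ_k‖ to zero forces the multipliers to the unique optimal value,

```latex
\lambda_k \;\to\; \lambda^*
\;=\; -\bigl(A(x^*)A(x^*)^T\bigr)^{-1} A(x^*)\, g(x^*).
```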

  36. Outline • Introduction/Motivation • Unconstrained optimization • Nonlinear equations • Constrained optimization • Algorithm Development • Step computation • Step acceptance • Global Analysis • Merit function and sufficient decrease • Satisfying first-order conditions • Conclusions/Final remarks

  37. Conclusion/Final remarks • Review • Defined a globally convergent inexact SQP algorithm • Require only inexact solutions of KKT system • Require only matrix-vector products involving objective and constraint function derivatives • Results also apply when only reduced Hessian of Lagrangian is assumed to be positive definite • Future challenges • Implementation and appropriate parameter values • Nearly-singular constraint Jacobian • Inexact derivative information • Negative curvature • etc., etc., etc….
