
ENGINEERING OPTIMIZATION Methods and Applications



Presentation Transcript


  1. ENGINEERING OPTIMIZATION Methods and Applications A. Ravindran, K. M. Ragsdell, G. V. Reklaitis Book Review

  2. Chapter 5: Constrained Optimality Criteria Part 1: Ferhat Dikbiyik Part 2:Yi Zhang Review Session July 2, 2010

  3. Constraints: Good guys or bad guys?

  4. Constraints: Good guys or bad guys? Good: constraints reduce the region in which we search for the optimum.

  5. Constraints: Good guys or bad guys? Bad: constraints make the optimization process much more complicated.

  6. Outline of Part 1 • Equality-Constrained Problems • Lagrange Multipliers • Economic Interpretation of Lagrange Multipliers • Kuhn-Tucker Conditions • Kuhn-Tucker Theorem

  7. Outline of Part 1 • Equality-Constrained Problems • Lagrange Multipliers • Economic Interpretation of Lagrange Multipliers • Kuhn-Tucker Conditions • Kuhn-Tucker Theorem

  8. Equality-Constrained Problems Goal: solve the problem as an unconstrained problem by explicitly eliminating K independent variables using the K equality constraints.
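The elimination idea can be sketched in a few lines of Python. The problem below is an invented toy example (not the book's Example 5.1): the single equality constraint is solved for one variable, reducing the two-variable constrained problem to a one-variable unconstrained one.

```python
# Hypothetical toy problem (not the book's Example 5.1):
# minimize f(x1, x2) = x1**2 + x2**2  subject to  h(x) = x1 + x2 - 1 = 0.

def f(x1, x2):
    return x1**2 + x2**2

def f_reduced(x1):
    # the constraint gives x2 = 1 - x1, eliminating one variable
    return f(x1, 1 - x1)

# f_reduced(x1) = x1**2 + (1 - x1)**2, so f_reduced'(x1) = 4*x1 - 2,
# which vanishes at x1 = 0.5; the constrained optimum is x* = (0.5, 0.5).
x1_star = 0.5
print(x1_star, 1 - x1_star, f_reduced(x1_star))  # 0.5 0.5 0.5
```

The same substitution works whenever the equality constraints can be solved explicitly for K of the variables; the "What if?" slide that follows points at the case where they cannot.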

  9. Example 5.1

  10. What if?

  11. Outline of Part 1 • Equality-Constrained Problems • Lagrange Multipliers • Economic Interpretation of Lagrange Multipliers • Kuhn-Tucker Conditions • Kuhn-Tucker Theorem

  12. Lagrange Multipliers Convert the constrained problem to an unconstrained problem with the help of certain unspecified parameters known as Lagrange multipliers.

  13. Lagrange Multipliers Lagrange function

  14. Lagrange Multipliers Lagrange multiplier
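For the same invented toy problem used above, a minimal sketch of the multiplier route, using the sign convention L(x, v) = f(x) - v·h(x); setting the partial derivatives of L to zero together with the constraint gives a small system whose solution is worked out by hand in the comments:

```python
# L(x1, x2, v) = x1**2 + x2**2 - v*(x1 + x2 - 1)
# Stationarity: dL/dx1 = 2*x1 - v = 0 and dL/dx2 = 2*x2 - v = 0,
# plus the constraint x1 + x2 - 1 = 0.
# The first two equations give x1 = x2 = v/2; substituting into the
# constraint gives v = 1, hence x* = (0.5, 0.5).
v_star = 1.0
x1_star = x2_star = v_star / 2

# verify all three conditions hold at the stationary point
residuals = (2*x1_star - v_star, 2*x2_star - v_star, x1_star + x2_star - 1)
print(residuals)  # (0.0, 0.0, 0.0)
```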

  15. Example 5.2

  16. Test whether the stationary point corresponds to a minimum: check that the relevant second-derivative (Hessian) matrix is positive definite.

  17. Example 5.3

  18. A maximum: the test is reversed — where a positive definite matrix indicates a minimum, a negative definite matrix indicates a maximum.
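For the symmetric 2×2 matrices that arise in two-variable examples, the definiteness test on the slides can be checked mechanically with Sylvester's criterion (leading principal minors). The matrices below are illustrative, not the ones from Examples 5.2 and 5.3:

```python
def positive_definite_2x2(a11, a12, a22):
    # Sylvester's criterion: the symmetric matrix [[a11, a12], [a12, a22]]
    # is positive definite iff a11 > 0 and its determinant is positive
    return a11 > 0 and a11 * a22 - a12 * a12 > 0

def negative_definite_2x2(a11, a12, a22):
    # A is negative definite iff -A is positive definite
    return positive_definite_2x2(-a11, -a12, -a22)

print(positive_definite_2x2(2.0, 0.0, 2.0))    # True  -> minimum
print(negative_definite_2x2(-2.0, 0.0, -2.0))  # True  -> maximum
print(positive_definite_2x2(1.0, 2.0, 1.0))    # False -> indefinite case
```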

  19. Outline of Part 1 • Equality-Constrained Problems • Lagrange Multipliers • Economic Interpretation of Lagrange Multipliers • Kuhn-Tucker Conditions • Kuhn-Tucker Theorem

  20. Economic Interpretation of Lagrange Multipliers The Lagrange multipliers have an important economic interpretation as shadow prices of the constraints, and their optimal values are very useful in sensitivity analysis.
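The shadow-price reading can be seen numerically on the invented toy problem from above, min x1² + x2² s.t. x1 + x2 = b, whose multiplier at b = 1 works out to v* = 1: perturbing the right-hand side b changes the optimal objective value at a rate equal to the multiplier.

```python
def f_star(b):
    # optimal value of: min x1**2 + x2**2  s.t.  x1 + x2 = b
    # (the optimum sits at x1 = x2 = b/2, so f* = b**2 / 2)
    return b**2 / 2

v_star = 1.0          # multiplier found at b = 1 for this toy problem
db = 1e-6
rate = (f_star(1 + db) - f_star(1)) / db   # numerical d(f*)/db at b = 1
print(abs(rate - v_star) < 1e-3)  # True: the sensitivity matches v*
```

This is exactly why optimal multiplier values are useful in sensitivity analysis: they price out a unit relaxation of each constraint.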

  21. Outline of Part 1 • Equality-Constrained Problems • Lagrange Multipliers • Economic Interpretation of Lagrange Multipliers • Kuhn-Tucker Conditions • Kuhn-Tucker Theorem

  22. Kuhn-Tucker Conditions

  23. NLP problem

  24. Kuhn-Tucker conditions (KTCs); together they define the Kuhn-Tucker problem (KTP).
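A sketch of verifying the KTCs at a candidate point, on a made-up inequality-constrained problem (not the book's Example 5.4), using the g(x) ≥ 0, u ≥ 0 sign convention: stationarity of the Lagrangian, primal feasibility, dual feasibility, and complementary slackness are each checked in turn.

```python
# Hypothetical problem: min (x1-2)**2 + (x2-1)**2
# subject to g(x) = 1 - x1 - x2 >= 0, with candidate x* = (1, 0), u* = 2.

def grad_f(x1, x2):
    return (2*(x1 - 2), 2*(x2 - 1))

def g(x1, x2):
    return 1 - x1 - x2

grad_g = (-1.0, -1.0)        # constant gradient of the linear constraint

x_star, u_star = (1.0, 0.0), 2.0
gf = grad_f(*x_star)

# stationarity: grad f(x*) - u* * grad g(x*) = 0
stationarity = all(abs(gf[i] - u_star * grad_g[i]) < 1e-12 for i in range(2))
feasible = g(*x_star) >= 0               # primal feasibility
dual_feasible = u_star >= 0              # dual feasibility
comp_slack = abs(u_star * g(*x_star)) < 1e-12  # complementary slackness

print(stationarity, feasible, dual_feasible, comp_slack)  # True True True True
```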

  25. Example 5.4

  26. Example 5.4

  27. Example 5.4

  28. Outline of Part 1 • Equality-Constrained Problems • Lagrange Multipliers • Economic Interpretation of Lagrange Multipliers • Kuhn-Tucker Conditions • Kuhn-Tucker Theorem

  29. Kuhn-Tucker Theorems Kuhn – Tucker Necessity Theorem Kuhn – Tucker Sufficient Theorem

  30. Kuhn-Tucker Necessity Theorem • Let • f, g, and h be differentiable functions • x* be a feasible solution to the NLP problem • the gradients of the binding inequality constraints and of the equality constraints hk for k = 1,…,K be linearly independent

  31. Kuhn-Tucker Necessity Theorem • Let • f, g, and h be differentiable functions • x* be a feasible solution to the NLP problem • the gradients of the binding inequality constraints and of the equality constraints hk for k = 1,…,K be linearly independent at the optimum (the constraint qualification) • If x* is an optimal solution to the NLP problem, then there exists a (u*, v*) such that (x*, u*, v*) solves the KTP given by the KTCs. Note: the constraint qualification is hard to verify, since it requires that the optimum solution be known beforehand!

  32. Kuhn-Tucker Necessity Theorem For certain special NLP problems, the constraint qualification is satisfied: • when all the inequality and equality constraints are linear • when all the inequality constraints are concave functions and the equality constraints are linear. Note: when the constraint qualification is not met at the optimum, there may not exist a solution to the KTP.
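The linear-independence part of the constraint qualification can be tested mechanically. For two constraint gradients in the plane a determinant test suffices; the helper and the gradient vectors below are hypothetical, chosen only to illustrate the check:

```python
def independent_2d(v, w, tol=1e-12):
    # two 2-D gradient vectors are linearly independent iff the
    # determinant of the 2x2 matrix they form is nonzero
    return abs(v[0]*w[1] - v[1]*w[0]) > tol

# e.g. gradients (1, 0) and (0, 1): independent, qualification holds
print(independent_2d((1, 0), (0, 1)))   # True
# gradients lying on the same line fail the qualification
print(independent_2d((1, 1), (2, 2)))   # False
```

For more than two gradients (or higher dimensions), the same idea becomes a matrix rank check on the stacked gradient rows.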

  33. Example 5.5 At x* = (1, 0) the gradients of the binding constraints are not linearly independent, so the constraint qualification fails at the optimum.

  34. Example 5.5 x* = (1, 0) No Kuhn-Tucker point at the optimum

  35. Kuhn-Tucker Necessity Theorem Given a feasible point that satisfies the constraint qualification: if it does not satisfy the KTCs, it is not optimal; if it does satisfy the KTCs, it may be optimal (the KTCs are necessary but not sufficient).

  36. Example 5.6

  37. Kuhn-Tucker Sufficiency Theorem • Let • f(x) be convex • the inequality constraints gj(x), j = 1,…,J, all be concave functions • the equality constraints hk(x), k = 1,…,K, be linear • If there exists a solution (x*, u*, v*) that satisfies the KTCs, then x* is an optimal solution.

  38. Example 5.4 Check that: • f(x) is convex • the inequality constraints gj(x), j = 1,…,J, are all concave functions • the equality constraints hk(x), k = 1,…,K, are linear.

  39. Example 5.4 ✓ f(x) is convex: its Hessian is positive semi-definite.

  40. Example 5.4 ✓ f(x) is convex ✓ the inequality constraints gj(x) are all concave: g1(x) is linear, hence both convex and concave; the remaining inequality constraint has a negative definite Hessian and is therefore concave.

  41. Example 5.4 ✓ f(x) is convex ✓ the inequality constraints gj(x), j = 1,…,J, are all concave functions ✓ the equality constraints hk(x), k = 1,…,K, are linear.

  42. Remarks For practical problems, the constraint qualification will generally hold. If the functions are differentiable, a Kuhn–Tucker point is a possible candidate for the optimum. Hence, many of the NLP methods attempt to converge to a Kuhn–Tucker point.

  43. Remarks When the sufficiency conditions of Theorem 5.2 hold, a Kuhn–Tucker point automatically becomes the global minimum. Unfortunately, the sufficiency conditions are difficult to verify, and practical problems often do not possess these nice properties. Note that the presence of a single nonlinear equality constraint is enough to violate the assumptions of Theorem 5.2.

  44. Remarks The sufficiency conditions of Theorem 5.2 have been generalized further to nonconvex inequality constraints, nonconvex objectives, and nonlinear equality constraints. These generalizations use broader classes of functions such as quasi-convex and pseudoconvex functions.
