CS B553: Algorithms for Optimization and Learning
Constrained optimization
Key Concepts
• Constraint formulations
• Necessary optimality conditions
• Lagrange multipliers: equality constraints
• KKT conditions: equalities and inequalities
Figure 1: Objective function f and feasible set S.
Figure 2: Local minima of the constrained problem are either unconstrained local minima of f that lie inside S, or points on the boundary of S.
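A tiny numerical illustration of this statement (the function and interval below are my own toy example, not from the slides): the unconstrained minimizer of f(x) = (x - 2)^2 is x = 2, which lies outside S = [0, 1], so the constrained minimum must land on the boundary of S.

```python
import numpy as np

# Toy example (my own, not from the slides): minimize f(x) = (x - 2)^2
# over the feasible interval S = [0, 1].
f = lambda x: (x - 2.0) ** 2

# The unconstrained critical point (f'(x) = 0) is x = 2, which is infeasible,
# so the constrained minimum must lie on the boundary of S.
xs = np.linspace(0.0, 1.0, 1001)   # dense sample of the feasible set
x_best = xs[np.argmin(f(xs))]
print(x_best)                      # ~1.0: the boundary point of S
```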
Figure 3: Constraint formulations.
• Bound constraints: l ≤ x ≤ u (componentwise, l1 ≤ x1 ≤ u1 and l2 ≤ x2 ≤ u2 in the figure)
• Linear inequalities: Ax ≤ b, each row Ai of A giving a half-space Ai x ≤ bi
• Linear equalities: Ax = b
• General, nonlinear constraints: h1(x) ≤ 0, h2(x) ≤ 0, g1(x) = 0
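As a sketch of how these formulations look in code (the helper name, signature, and tolerances below are my own, not part of the course material), here is a minimal feasibility check for a point x against bounds, linear constraints, and general nonlinear constraints:

```python
import numpy as np

# Hedged sketch: check whether a point x satisfies each constraint formulation.
def is_feasible(x, lower=None, upper=None, A_ineq=None, b_ineq=None,
                A_eq=None, b_eq=None, h=(), g=(), tol=1e-8):
    x = np.asarray(x, dtype=float)
    if lower is not None and np.any(x < np.asarray(lower) - tol):
        return False                                   # bound constraint l <= x
    if upper is not None and np.any(x > np.asarray(upper) + tol):
        return False                                   # bound constraint x <= u
    if A_ineq is not None and np.any(np.asarray(A_ineq) @ x > np.asarray(b_ineq) + tol):
        return False                                   # linear inequalities Ax <= b
    if A_eq is not None and np.any(np.abs(np.asarray(A_eq) @ x - np.asarray(b_eq)) > tol):
        return False                                   # linear equalities Ax = b
    if any(h_i(x) > tol for h_i in h):
        return False                                   # nonlinear h_i(x) <= 0
    if any(abs(g_j(x)) > tol for g_j in g):
        return False                                   # nonlinear g_j(x) = 0
    return True

# Example: a point inside the unit box that also satisfies x1 + x2 <= 2.
print(is_feasible([0.5, 0.5], lower=[0, 0], upper=[1, 1],
                  h=[lambda x: x[0] + x[1] - 2.0]))    # True
```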
Figure 4: Lagrange multipliers, one equality constraint. At a local minimum (or maximum) of f on the constraint curve g(x) = 0, the gradient of the objective and the gradient of the constraint must be parallel (the figure shows ∇f and ∇g at two such points x1 and x2 on g(x) = 0).
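A minimal worked example of this condition (the objective and constraint are my own choice, not from the slides): minimize f(x) = x1^2 + x2^2 subject to g(x) = x1 + x2 - 1 = 0. Writing the parallel-gradient condition as ∇f(x) + λ∇g(x) = 0 (the same sign convention as ∇f = -λ∇g on the Figure 6 slide below), together with g(x) = 0, gives a small linear system:

```python
import numpy as np

# Worked example (my choice): minimize f(x) = x1^2 + x2^2
# subject to g(x) = x1 + x2 - 1 = 0.
# Stationarity grad f(x) + lam * grad g(x) = 0 plus the constraint g(x) = 0
# is a linear system in (x1, x2, lam):
#   2*x1        + lam = 0
#          2*x2 + lam = 0
#   x1  +  x2         = 1
K = np.array([[2.0, 0.0, 1.0],
              [0.0, 2.0, 1.0],
              [1.0, 1.0, 0.0]])
rhs = np.array([0.0, 0.0, 1.0])
x1, x2, lam = np.linalg.solve(K, rhs)
print(x1, x2, lam)   # 0.5 0.5 -1.0: minimizer (0.5, 0.5), multiplier -1
```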
Figure 5: If the constraint gradient and the objective gradient are not parallel, then there exists some direction v you can move in to change f without changing g(x) (the figure shows ∇f(x), ∇g(x), and such a direction v at a point x on the constraint).
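The same idea can be checked numerically (reusing the toy problem above, my own example): at the feasible but non-optimal point x = (1, 0) the gradients are not parallel, so the component of ∇f tangent to the constraint is nonzero, and a small step along it decreases f while leaving g unchanged to first order.

```python
import numpy as np

# Same toy problem as above: f(x) = x1^2 + x2^2, g(x) = x1 + x2 - 1 = 0.
f      = lambda x: x[0]**2 + x[1]**2
g      = lambda x: x[0] + x[1] - 1.0
grad_f = lambda x: np.array([2*x[0], 2*x[1]])
grad_g = lambda x: np.array([1.0, 1.0])

x = np.array([1.0, 0.0])          # feasible but not optimal
gf, gg = grad_f(x), grad_g(x)
# Component of grad f tangent to the constraint (orthogonal to grad g):
v = gf - (gf @ gg) / (gg @ gg) * gg
print(v)                          # nonzero -> gradients are not parallel
t = 1e-4
x_new = x - t * v                 # small step against the tangent component
print(f(x_new) < f(x))            # True: f decreased
print(abs(g(x_new)))              # ~0: constraint unchanged
```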
Figure 6: Interpretation. Suppose x* is a global minimum, so that ∇f(x*) = -λ∇g(x*). If I were to relax the constraint g(x) = 0 at a constant rate toward g(x) = 1, the value of λ tells me the rate of decrease of f(x*).
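This sensitivity interpretation can be checked on the toy problem above (my own example): relaxing the constraint to x1 + x2 = 1 + c gives the closed-form optimal value f*(c) = (1 + c)^2 / 2, and the finite-difference rate of decrease of f* at c = 0 matches the multiplier λ = -1 found in the earlier sketch.

```python
# Relaxed constraint g(x) = c means x1 + x2 = 1 + c; the constrained
# optimal value is f*(c) = (1 + c)^2 / 2 in closed form.
f_star = lambda c: (1.0 + c)**2 / 2.0

eps = 1e-6
rate_of_decrease = -(f_star(eps) - f_star(0.0)) / eps
print(rate_of_decrease)   # ~ -1.0, equal to the multiplier lam above
```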
Figure 7: One inequality constraint h(x) ≤ 0. Either:
1. x is a critical point of f with h(x) < 0 (an interior point x1 with ∇f(x1) = 0), or
2. x is on the boundary h(x) = 0 and satisfies a Lagrangian condition (at boundary points x2, x3 the objective gradient is parallel to the constraint gradient).
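A sketch of this two-case check on a concrete problem (the objective and constraint are my own choice, not from the slides): minimize f(x) = (x1 - 2)^2 + x2^2 subject to h(x) = x1^2 + x2^2 - 1 ≤ 0. The interior case yields no candidate, and the boundary case gives x = (1, 0) with multiplier μ = 1 ≥ 0 under the convention ∇f + μ∇h = 0.

```python
import numpy as np

# Toy problem: minimize f(x) = (x1 - 2)^2 + x2^2  s.t.  h(x) = x1^2 + x2^2 - 1 <= 0.
f = lambda x: (x[0] - 2.0)**2 + x[1]**2
h = lambda x: x[0]**2 + x[1]**2 - 1.0

# Case 1: the unconstrained critical point of f is (2, 0), but h(2, 0) = 3 > 0,
# so it is infeasible and this case yields no candidate.
print(h(np.array([2.0, 0.0])))    # 3.0 > 0

# Case 2: on the boundary h(x) = 0, the condition grad f + mu * grad h = 0
# with mu >= 0 gives x = (1, 0), mu = 1 (worked out by hand); verify it:
x, mu = np.array([1.0, 0.0]), 1.0
grad_f = np.array([2*(x[0] - 2.0), 2*x[1]])
grad_h = np.array([2*x[0], 2*x[1]])
print(grad_f + mu * grad_h)       # [0. 0.] -> Lagrangian condition holds
print(h(x), mu >= 0)              # 0.0 True -> feasible boundary point, valid multiplier
```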
Figure 8: Multiple inequality constraints h1(x) ≤ 0 and h2(x) ≤ 0 (the figure shows the regions h1(x) < 0 and h2(x) < 0 and a candidate point x). At a local minimum, each constraint is either inactive (strictly satisfied, with zero multiplier) or active (holds with equality and contributes a Lagrangian term with nonnegative multiplier); this case analysis is what the KKT conditions summarize.
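To make the case analysis concrete, here is a minimal KKT check for several inequality constraints (a sketch under my own naming and tolerances, not course code); it tests stationarity, primal feasibility, dual feasibility, and complementary slackness at a candidate point:

```python
import numpy as np

# Hedged sketch: verify the KKT conditions for constraints h_i(x) <= 0 at a
# candidate point, given the gradients, constraint values, and multipliers.
def kkt_satisfied(grad_f, grads_h, h_vals, mus, tol=1e-6):
    grad_f  = np.asarray(grad_f, dtype=float)
    grads_h = [np.asarray(gh, dtype=float) for gh in grads_h]
    h_vals  = np.asarray(h_vals, dtype=float)
    mus     = np.asarray(mus, dtype=float)
    stationarity = np.allclose(
        grad_f + sum(m * gh for m, gh in zip(mus, grads_h)), 0.0, atol=tol)
    primal_feasible = np.all(h_vals <= tol)                 # h_i(x) <= 0
    dual_feasible   = np.all(mus >= -tol)                   # mu_i >= 0
    complementarity = np.all(np.abs(mus * h_vals) <= tol)   # mu_i * h_i(x) = 0
    return stationarity and primal_feasible and dual_feasible and complementarity

# Reusing the disk-constrained example above at x = (1, 0), with a second,
# inactive constraint h2(x) = -x1 <= 0 (its multiplier is 0 by complementarity):
print(kkt_satisfied(grad_f=[-2.0, 0.0],
                    grads_h=[[2.0, 0.0], [-1.0, 0.0]],
                    h_vals=[0.0, -1.0],
                    mus=[1.0, 0.0]))   # True
```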