Presentation Transcript


  1. Outline • relationship among the topics • secrets • LP with upper bounds • by the Simplex method • basic feasible solution (BFS) • by the Simplex method for bounded variables • extended basic feasible solution (EBFS) • optimality conditions for bounded variables • ideas of the proof • examples • Example 1 for the ideas (not the exact procedure) • Example 2 for the exact procedure

  2. A Depot for Multiple Products • delivery of multiple products from one depot by a fleet of trucks • possible formulation: • objective function • common constraints, e.g., trucks, DC capacity, etc. • network constraints for the type-1 product • network constraints for the type-2 product • … • network constraints for the type-n product • non-negativity constraints

  3. A General Type of Optimization Problems • structure of many problems: • network constraints: easy • other constraints: hard • making use of the easy constraints to solve the problems • solution methods: large-scale optimization • column generation, Lagrangian relaxation, Dantzig-Wolfe decomposition … • basis: linear programming, network optimization (and also non-linear optimization, integer optimization, combinatorial optimization) • (formulation sketch: objective function; network constraints; hard constraints; non-negativity constraints)

  4. Relationship of Solution Techniques • two directions of theoretical development for network programming • from special structures of networks • from linear programming • ideal: understanding development in both directions • (diagram relating linear prog., network prog., int. prog., non-linear prog., dynamic prog., …)

  5. Relationship of Solution Techniques • (diagram relating the techniques: minimum cost flow; column generation and Dantzig-Wolfe decomposition; network algorithms; network simplex; revised simplex method; shortest-path algorithms; simplex method; Lagrangian relaxation; linear algebra; non-linear optimization)

  6. Our Topics • simplex method for bounded variables • linkage between LP and network simplex • optimality conditions for minimum cost flow networks • minimum cost flow algorithms • the standard one, and successive shortest path • equivalence between network and LP optimality conditions • revised simplex • column generation • Dantzig-Wolfe decomposition • Lagrangian relaxation • It takes more than one semester to cover these topics in detail! We will only cover the ideas.

  7. Secrets

  8. The Most Beautiful …

  9. Maybe the Most Beautiful of All… • linear algebra: geometric properties, algebraic properties, matrix properties

  10. LP with Upper Bounds

  11. LP with Upper Bounds • upper bounds: common in network problems, e.g., an arc with finite capacity • much of the theory of network optimization comes from LP

  12. To Solve LP with Upper Bounds • incorporate the upper-bound constraints into the set of functional constraints and solve accordingly

  13. To Solve LP with Upper Bounds • In the simplex method the lower-bound constraints 0 ≤ x do not appear in A. • Is it possible to work only with A even with upper-bound constraints? • Yes.
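
This is also how off-the-shelf solvers behave. A minimal sketch (not from the slides; the tiny LP data are invented for illustration) using SciPy's linprog, whose bounds argument keeps 0 ≤ x ≤ u out of the constraint matrix A:

```python
# The same tiny LP solved two ways: (a) upper bounds written as extra rows of A,
# as on slide 12, and (b) upper bounds passed separately so the solver works
# only with the functional constraints, as on slide 13.
from scipy.optimize import linprog

c = [-2.0, -1.0]                              # min -2x - y

# (a) bounds folded into the functional constraints
A1 = [[1.0, 1.0], [1.0, 0.0], [0.0, 1.0]]     # x + y <= 3, x <= 2, y <= 5
b1 = [3.0, 2.0, 5.0]
r1 = linprog(c, A_ub=A1, b_ub=b1, bounds=[(0, None), (0, None)], method="highs")

# (b) bounds handled implicitly; A contains only x + y <= 3
A2 = [[1.0, 1.0]]
b2 = [3.0]
r2 = linprog(c, A_ub=A2, b_ub=b2, bounds=[(0, 2), (0, 5)], method="highs")

print(r1.x, r1.fun)                           # [2. 1.] -5.0
print(r2.x, r2.fun)                           # same optimum, smaller A
```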

  14. BFS for Standard LP • A: m × n, m ≤ n, of rank m • basic feasible solution (BFS) x of the LP, i.e., • feasible: Ax = b, 0 ≤ x • basic • non-basic variables: (at least) n − m variables = 0 • basic variables: m non-negative variables with linearly independent columns

  15. Extended Basic Feasible Solution of LP with Bounded Variables • A: m × n, m ≤ n, of rank m • extended basic feasible solution (EBFS) x of the LP with bounded variables, i.e., • feasible: Ax = b, 0 ≤ x ≤ u • basic solution • non-basic variables: (at least) n − m variables = 0 or = their upper bounds • basic variables: m variables with 0 ≤ xi ≤ ui and linearly independent columns
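
A small Python sketch (my own illustration, not part of the slides) that checks this definition for a candidate point; the function is_ebfs and the large artificial slack bounds are assumptions made only for the example, and the data are those of Example 1 later in the deck:

```python
import numpy as np

def is_ebfs(A, b, u, x, basic, tol=1e-9):
    """Check the EBFS definition: Ax = b, 0 <= x <= u, every non-basic variable
    at 0 or at its upper bound, and m linearly independent basic columns."""
    A, b, u, x = (np.asarray(v, dtype=float) for v in (A, b, u, x))
    m, n = A.shape
    nonbasic = [j for j in range(n) if j not in basic]
    feasible = np.allclose(A @ x, b) and np.all(x >= -tol) and np.all(x <= u + tol)
    at_bound = all(abs(x[j]) < tol or abs(x[j] - u[j]) < tol for j in nonbasic)
    independent = len(basic) == m and np.linalg.matrix_rank(A[:, basic]) == m
    return feasible and at_bound and independent

# Example 1 in equality form, variables [x, y, s1, s2]; the slacks get a large
# artificial upper bound purely for this check.
A = [[1, 2, 1, 0], [2, 1, 0, 1]]
b = [20, 16]
u = [2, 8, 1e9, 1e9]
print(is_ebfs(A, b, u, x=[2, 8, 2, 4], basic=[2, 3]))   # True
```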

  16. Optimality Conditions of Standard LP • Maximum Conditions: a BFS x is maximal if • c̄j ≤ 0 for every non-basic variable xj = 0 • Minimum Conditions: a BFS x is minimal if • c̄j ≥ 0 for every non-basic variable xj = 0 • intuition • c̄j: the increase of the objective function per unit increase in xj • maximum condition: increasing a non-basic xj cannot raise the objective • minimum condition: increasing a non-basic xj cannot lower the objective

  17. Optimality Conditions of LP with Bounded Variables • Maximum Conditions: an EBFS x is maximal if • c̄j ≤ 0 for every non-basic variable xj = 0, and • c̄j ≥ 0 for every non-basic variable xj = uj • Minimum Conditions: an EBFS x is minimal if • c̄j ≥ 0 for every non-basic variable xj = 0, and • c̄j ≤ 0 for every non-basic variable xj = uj
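
In reduced-cost notation (a standard reconstruction; the original symbols did not survive the transcript), the conditions read:

```latex
% \bar{c}_j is the reduced cost of x_j with respect to the current basis B.
\[
  \bar{c}_j \;=\; c_j - c_B B^{-1} A_j .
\]
% For a minimization problem an EBFS x is optimal if
\[
  \bar{c}_j \ge 0 \ \text{for every non-basic } x_j = 0,
  \qquad
  \bar{c}_j \le 0 \ \text{for every non-basic } x_j = u_j ,
\]
% and the two inequalities are reversed for maximization.
```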

  18. How to Prove?

  19. General Idea • optimality conditions of the EBFS • from duality theory and complementary slackness conditions

  20. Complementary Slackness Conditions • primal-dual pair • Theorem 1 (Complementary Slackness Conditions) • if x is primal feasible and y is dual feasible • then x is primal optimal and y is dual optimal iff xj(yTAj − cj) = 0 for all j, and yi(bi − Aix) = 0 for all i
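
The primal-dual pair Theorem 1 refers to appeared as a figure on the original slide; a sketch of the standard symmetric form that matches the stated conditions:

```latex
\[
\begin{aligned}
  \text{(P)}\quad & \max\ c^T x      & \qquad \text{(D)}\quad & \min\ b^T y \\
                  & \text{s.t. } Ax \le b,\ x \ge 0
                  &                        & \text{s.t. } A^T y \ge c,\ y \ge 0
\end{aligned}
\]
```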

  21. Complementary Slackness Conditions • primal-dual pair • Theorem 2 (Necessary and Sufficient Condition) • if x is primal feasible • then x is primal optimal iff there exists a dual feasible y such that x and y satisfy the Complementary Slackness Conditions

  22. Complementary Slackness Conditions for LP with Bounded Variables • by Theorem 2, a primal feasible x and a dual feasible (yT, wT) are optimal iff • xj(yTAj + wj − cj) = 0 for all j • yi(bi − Aix) = 0 for all i • wj(uj − xj) = 0 for all j
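
A sketch of where the extra multipliers come from; w_j is my notation for the dual variable of the bound x_j ≤ u_j, since the slide's symbol was lost:

```latex
\[
\begin{aligned}
  \text{(P)}\quad & \max\ c^T x \quad \text{s.t. } Ax \le b,\ x \le u,\ x \ge 0,\\
  \text{(D)}\quad & \min\ b^T y + u^T w \quad \text{s.t. } A^T y + w \ge c,\ y \ge 0,\ w \ge 0,
\end{aligned}
\]
% and complementary slackness for this pair is exactly the three conditions above:
\[
  x_j\,(y^T A_j + w_j - c_j) = 0,\qquad
  y_i\,(b_i - A_i x) = 0,\qquad
  w_j\,(u_j - x_j) = 0 .
\]
```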

  23. General Idea of the Proof • optimality conditions of the EBFS • from duality theory and complementary slackness conditions • ideas of the proof • given an EBFS x satisfying the upper-bound optimality conditions • it is then possible to find dual feasible variables (yT, wT)T such that x and (yT, wT)T satisfy the complementary slackness conditions

  24. Example 1. Upper-Bound Constraints as Functional Constraints • max 2x + 5y ⇔ min −2x − 5y, • s.t. • x + 2y ≤ 20, • 2x + y ≤ 16, • 0 ≤ x ≤ 2, 0 ≤ y ≤ 8.

  25. Examples of LP with Bounded Variables

  26. Example 1. Upper-Bound Constraints as Functional Constraints • min −2x − 5y, • s.t. • x + 2y ≤ 20, • 2x + y ≤ 16, • 0 ≤ x ≤ 2, 0 ≤ y ≤ 8. • max. value = 44 • x* = 2 and y* = 8
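
A quick numerical check of Example 1 (my sketch, using SciPy's linprog with the bounds passed directly):

```python
from scipy.optimize import linprog

# max 2x + 5y is solved as min -2x - 5y; only the two functional constraints go
# into A_ub, while 0 <= x <= 2 and 0 <= y <= 8 are handled by the bounds argument.
res = linprog(c=[-2, -5],
              A_ub=[[1, 2], [2, 1]], b_ub=[20, 16],
              bounds=[(0, 2), (0, 8)], method="highs")
print(res.x, -res.fun)   # [2. 8.] 44.0
```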

  27. The following procedure is not exactly the Simplex Method for Bounded Variables. It primarily brings out the ideas of the exact method.

  28. Example 1. Upper-Bound Constraints by Optimality Conditions of Bounded Variables • y as the entering variable (reduced cost −5) • 2y + s1 = 20 • y + s2 = 16 • y ≤ 8 • (recall: min −2x − 5y, s.t. x + 2y ≤ 20, 2x + y ≤ 16, 0 ≤ x ≤ 2, 0 ≤ y ≤ 8)

  29. Example 1. Upper-Bound Constraints by Optimality Conditions of Bounded Variables • mark the non-basic variable y at its upper bound • for y = 8 • obj. fun.: −2x − 5y − z = 0 ⇒ −2x − z = 40 • eqt. (1): x + 2y + s1 = 20 ⇒ x + s1 = 4 • eqt. (2): 2x + y + s2 = 16 ⇒ 2x + s2 = 8

  30. Example 1. Upper-Bound Constraints by Optimality Conditions of Bounded Variables • x as the entering variable • x + s1 = 4 • 2x + s2 = 8 • x ≤ 2 • (recall: min −2x − 5y, s.t. x + 2y ≤ 20, 2x + y ≤ 16, 0 ≤ x ≤ 2, 0 ≤ y ≤ 8)

  31. Example 1. Upper-Bound Constraints by Optimality Conditions of Bounded Variables • for x at its upper bound 2, mark x, and • obj. fun.: −2x − z = 40 ⇒ −z = 44 • eqt. (1): x + s1 = 4 ⇒ s1 = 2 • eqt. (2): 2x + s2 = 8 ⇒ s2 = 4 • (recall: min −2x − 5y, s.t. x + 2y ≤ 20, 2x + y ≤ 16, 0 ≤ x ≤ 2, 0 ≤ y ≤ 8)

  32. Example 1. Upper-Bound Constraints by Optimality Conditions of Bounded Variables • satisfying the optimality conditions for bounded variables (minimization) • c̄j ≥ 0 for every non-basic variable xj = 0, and • c̄j ≤ 0 for every non-basic variable xj = uj • z* = −44, with x* = 2 and y* = 8 (i.e., max 2x + 5y = 44)
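
A short check (my own, with NumPy) that the final point of Example 1 satisfies these conditions: the basis is {s1, s2}, while x and y are non-basic at their upper bounds:

```python
import numpy as np

A = np.array([[1., 2., 1., 0.],      # columns: x, y, s1, s2
              [2., 1., 0., 1.]])
c = np.array([-2., -5., 0., 0.])     # objective of min -2x - 5y
basic = [2, 3]                       # basis {s1, s2} at the final step

B = A[:, basic]
y = np.linalg.solve(B.T, c[basic])   # simplex multipliers y^T = c_B B^{-1}
reduced = c - y @ A                  # reduced costs c_j - y^T A_j
print(reduced[:2])                   # [-2. -5.]: both <= 0 while x and y sit at
                                     # their upper bounds, so the EBFS is minimal
```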

  33. Example 1 Being Too Specific • in general, variables move among several statuses • non-basic at 0 • basic at 0 • basic between 0 and the upper bound • basic at the upper bound • non-basic at the upper bound • Simplex method for bounded variables: a special algorithm to keep track of all these possibilities

  34. The following example follows the exact procedure of the Simplex Method for Bounded Variables.

  35. Example 2 • max 3x1 + 5x2 + 2x3 ⇔ min −3x1 − 5x2 − 2x3, • s.t. • x1 + x2 + 2x3 ≤ 7, • 2x1 + 4x2 + 3x3 ≤ 15, • 0 ≤ x1 ≤ 4, 0 ≤ x2 ≤ 3, 0 ≤ x3 ≤ 3.

  36. Example 2 by Simplex Method for Bounded Variables • potential entering variable: x2 • bounded by its upper bound 3 • define x2' = u2 − x2 = 3 − x2 • (recall: min −3x1 − 5x2 − 2x3, s.t. x1 + x2 + 2x3 ≤ 7, 2x1 + 4x2 + 3x3 ≤ 15, 0 ≤ x1 ≤ 4, 0 ≤ x2 ≤ 3, 0 ≤ x3 ≤ 3)
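
A sketch of what the substitution does to the system (x2' is my notation for the complemented variable; the slide's symbol was lost): with x2 = 3 − x2',

```latex
\[
\begin{aligned}
  \min\ & -3x_1 + 5x_2' - 2x_3 - 15\\
  \text{s.t. } & x_1 - x_2' + 2x_3 + s_1 = 4,\\
               & 2x_1 - 4x_2' + 3x_3 + s_2 = 3,\\
               & 0 \le x_1 \le 4,\ 0 \le x_2' \le 3,\ 0 \le x_3 \le 3,\ s_1, s_2 \ge 0 .
\end{aligned}
\]
```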

  37. Example 2 by Simplex Method for Bounded Variables

  38. Example 2 by Simplex Method for Bounded Variables • x1 as the (potential) entering variable • s2 as the leaving variable • a pivot operation as in the standard Simplex Method • (recall: min −3x1 − 5x2 − 2x3, s.t. x1 + x2 + 2x3 ≤ 7, 2x1 + 4x2 + 3x3 ≤ 15, 0 ≤ x1 ≤ 4, 0 ≤ x2 ≤ 3, 0 ≤ x3 ≤ 3)
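
The tableau for this step was a figure in the original; a reconstruction under the substitution above: x1 enters (reduced cost −3), the ratios are 4/1 = 4 and 3/2 = 1.5 against the upper bound u1 = 4, so s2 leaves, and after the pivot

```latex
\[
\begin{aligned}
  & x_1 - 2x_2' + 1.5\,x_3 + 0.5\,s_2 = 1.5,\\
  & x_2' + 0.5\,x_3 + s_1 - 0.5\,s_2 = 2.5,\\
  & z = -19.5 - x_2' + 2.5\,x_3 + 1.5\,s_2 ,
\end{aligned}
\]
% so x_2' (reduced cost -1) is the only remaining candidate to enter.
```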

  39. Example 2 by Simplex Method for Bounded Variables • which variable can be the entering variable? • can s1 be a leaving variable? Yes • can x1 be a leaving variable? Yes • (recall: min −3x1 − 5x2 − 2x3, s.t. x1 + x2 + 2x3 ≤ 7, 2x1 + 4x2 + 3x3 ≤ 15, 0 ≤ x1 ≤ 4, 0 ≤ x2 ≤ 3, 0 ≤ x3 ≤ 3)

  40. Example 2 by Simplex Method for Bounded Variables • when x2' = 1.25, x1 reaches its upper bound 4 • replace x1 by x1' = u1 − x1 = 4 − x1, and x1' is a basic variable = 0 • result • (recall: min −3x1 − 5x2 − 2x3, s.t. x1 + x2 + 2x3 ≤ 7, 2x1 + 4x2 + 3x3 ≤ 15, 0 ≤ x1 ≤ 4, 0 ≤ x2 ≤ 3, 0 ≤ x3 ≤ 3)
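
Why 1.25 (continuing the reconstruction above): as the entering variable x2' increases,

```latex
\[
  x_1 = 1.5 + 2x_2' \le 4 \ \Rightarrow\ x_2' \le 1.25, \qquad
  s_1 = 2.5 - x_2' \ge 0 \ \Rightarrow\ x_2' \le 2.5, \qquad
  x_2' \le u_2' = 3,
\]
% so the basic variable x_1 blocks first, hitting its upper bound at x_2' = 1.25.
```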

  41. Example 2 by Simplex Method for Bounded Variables • x2' as the entering variable • a “normal” pivot operation with aij < 0 • (recall: min −3x1 − 5x2 − 2x3, s.t. x1 + x2 + 2x3 ≤ 7, 2x1 + 4x2 + 3x3 ≤ 15, 0 ≤ x1 ≤ 4, 0 ≤ x2 ≤ 3, 0 ≤ x3 ≤ 3)
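
A sketch of this pivot (x1' = 4 − x1, as introduced on the previous slide): the old x1 row becomes −x1' − 2x2' + 1.5x3 + 0.5s2 = −2.5, and pivoting on the −2 gives

```latex
\[
  x_2' = 1.25 - 0.5\,x_1' + 0.75\,x_3 + 0.25\,s_2, \qquad
  z = -20.75 + 0.5\,x_1' + 1.75\,x_3 + 1.25\,s_2 ,
\]
% where every reduced cost is now non-negative, so the current EBFS is optimal.
```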

  42. Example 2 by Simplex Method for Bounded Variables • minimum reached • z* = −20.75, x1* = 4, x2* = 1.75, x3* = 0 (i.e., max 3x1 + 5x2 + 2x3 = 20.75) • (recall: min −3x1 − 5x2 − 2x3, s.t. x1 + x2 + 2x3 ≤ 7, 2x1 + 4x2 + 3x3 ≤ 15, 0 ≤ x1 ≤ 4, 0 ≤ x2 ≤ 3, 0 ≤ x3 ≤ 3)
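
A quick numerical check of Example 2 (my sketch, again with SciPy):

```python
from scipy.optimize import linprog

# min -3x1 - 5x2 - 2x3 with the bounds handled by the bounds argument.
res = linprog(c=[-3, -5, -2],
              A_ub=[[1, 1, 2], [2, 4, 3]], b_ub=[7, 15],
              bounds=[(0, 4), (0, 3), (0, 3)], method="highs")
print(res.x, res.fun)   # [4.   1.75 0.  ]  -20.75
```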
