Introduction to Integer Programming Modeling and Methods Michael Trick Carnegie Mellon University CPAI-OR School, Le Croisic 2002
Some History • Integer Programming goes back a long way: • Schrijver takes it back to ancient times (linear diophantine equations), Euler (1748), Monge (1784) and much more. • “Proper” study began in the 1950s • Dantzig (1951): linear programming • Gomory (1958): cutting planes • Land and Doig (1960): branch and bound • Survey books practically every five years since • Tremendous practical success in the last 10-15 years
Scope • This talk will not be comprehensive! • Attempt to get across main concepts of integer programming • Relaxations • Primal Heuristics • Branch and Bound • Cutting Planes
Integer Program (IP)
Minimize cx (linear objective)
Subject to Ax = b (linear constraints)
l <= x <= u
some or all of the xj integral (this is what makes things hard!)
x: variables
Rules of the Game • Must put in that form! • Seems limiting, but 50 years of experience gives “tricks of the trade” • Many formulations for same problem
Example Formulations Warehouse location: n stores, m possible warehouses; cost k[j] to open warehouse j; cost c[i,j] to handle store i out of warehouse j. Minimize the total opening costs plus handling costs Subject to Each store assigned to one open warehouse
Warehouse Formulation Variables x[i,j] = 1 if store i served by warehouse j; 0 otherwise y[j] = 1 if warehouse j open; 0 otherwise Objective Minimize sum_j k[j]y[j]+sum_i,j c[i,j]x[i,j]
Warehouse Formulation • Constraints: sum_j x[i,j] = 1 for all i sum_i x[i,j] <= ny[j] for all j
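As a concrete check of this formulation, here is a small pure-Python sketch (the cost numbers are invented for illustration) that evaluates the objective and tests both constraint families for a candidate 0-1 solution:

```python
# A minimal sketch of the warehouse-location formulation as data,
# with a feasibility and objective check for a candidate 0-1 solution.
# The instance numbers are made up for illustration.

k = [8, 6]                      # k[j]: cost to open warehouse j
c = [[2, 5],                    # c[i][j]: cost to serve store i from warehouse j
     [4, 1],
     [3, 3]]
n, m = len(c), len(k)           # n stores, m warehouses

def cost(x, y):
    """Objective: sum_j k[j]*y[j] + sum_{i,j} c[i][j]*x[i][j]."""
    return (sum(k[j] * y[j] for j in range(m))
            + sum(c[i][j] * x[i][j] for i in range(n) for j in range(m)))

def feasible(x, y):
    # Each store assigned to exactly one warehouse: sum_j x[i][j] = 1.
    if any(sum(x[i][j] for j in range(m)) != 1 for i in range(n)):
        return False
    # Only use open warehouses: sum_i x[i][j] <= n * y[j].
    return all(sum(x[i][j] for i in range(n)) <= n * y[j] for j in range(m))

x = [[1, 0], [0, 1], [0, 1]]    # store 1 from warehouse 1; stores 2, 3 from warehouse 2
y = [1, 1]
print(feasible(x, y), cost(x, y))   # True 20
```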
Binary Integer Programs • Restrict variables to be 0-1 • Many specialized methods • OR people are very good at formulating difficult problems within these restrictions
Key concepts • Relaxation R(IP) • “easily” solved problem such that • Optimal solution value is no more than that of IP • If its solution is feasible for IP, then that solution is optimal for IP • If R(IP) is infeasible then so is IP • Most common is the “linear relaxation”: drop the integrality requirements and solve the linear program • Others possible: lagrangian relaxation, lagrangian decomposition, bounds relaxation, etc.
Why this fetish with Linear Relaxations? • IP people are very focused on linear relaxations. Why? • Sometimes linear=integer • Linear relaxations as global constraints • Duals and reduced costs
Linear=integer formulations • Happens naturally for some problems • Network flows • Totally unimodular constraint matrices • Takes more work, but defined for • Matchings • Minimum spanning trees • Closely associated with polynomial solvability
Duals and Reduced Costs Associated with the solution of a linear program are the dual values --- one per constraint --- each measuring the marginal value of changing the right-hand-side of its constraint --- Useful in many algorithmic ways
Dual example Sum_i x[i,j]-y[j] <= 0 Suppose facility j* has cost 10 and y*[j*] = 0. The dual value of this constraint is 4. What cost must facility j* have to be appealing? Answer: no more than 10-4=6.
Dual Example 2 • Products 1, 2, 3 use chemicals A, B Maximize 3x1+2x2+2x3 Subject to x1+x2+2x3 <= 10; (dual .667) 5x1+2x2+x3 <= 20; (dual .667) Solution: x2=10, x1=x3=0 (value 20) What objective must a product that uses 4 of A and 3 of B have to be appealing: at least 4(.667)+3(.667) = 4.67
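The reduced-cost arithmetic on this slide is easy to check: with dual values of 2/3 on each chemical constraint, a new product is appealing only if its objective coefficient exceeds what its resource usage is "worth" at those duals. A one-function sketch:

```python
# Breakeven objective coefficient for a new product, given the dual
# values from the example (2/3 per unit of chemical A and of B).
from fractions import Fraction

u = Fraction(2, 3)   # dual of the chemical-A constraint
v = Fraction(2, 3)   # dual of the chemical-B constraint

def breakeven(a_use, b_use):
    """Minimum objective coefficient a new product needs to be appealing."""
    return a_use * u + b_use * v

print(breakeven(4, 3))   # 14/3, i.e. 4.67 as on the slide
```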
Final advantage of linear relaxations: Global • Linear relaxations are • Relatively easy to solve: huge advances in 15 years • Incorporate “global” information • Often provide good bounds and guidelines for integer program • Variables with very bad reduced cost likely not in optimal integer solution • Rounding doesn’t always work, but often gets good feasible solutions
Feasible solutions • Solutions that satisfy all the constraints but might not be optimal • Generally found by heuristics • Can be problem specific • Must have value greater than or equal to optimal value (for minimizing)
Feasible Solution
[Figure: a feasible integer solution inside the relaxation's region]
Fundamental Branch and Bound Algorithm
Solve the relaxation to get x*
• If the relaxation is infeasible, then IP is infeasible
• Else, if x* is feasible for IP, then x* is optimal for IP
• Else, create new problems IP1 and IP2 by branching; solve them recursively, stopping on a subproblem once we can prove it cannot contain an optimal solution to IP (bounding)
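The algorithm can be sketched end-to-end on a toy instance. Below, a hypothetical 0-1 knapsack (a maximization, so the bound test flips: prune when the relaxation value is no better than the incumbent) uses the fractional knapsack, solvable greedily, as its "easily solved" relaxation; if the relaxation solution is integral it is optimal for that subproblem, otherwise we branch on the fractional variable. The data is invented for illustration.

```python
# Branch and bound for a tiny 0-1 knapsack (maximize value, weight <= cap).
# Relaxation: fractional knapsack, solved greedily by value/weight ratio.
values  = [10, 13, 7, 8]
weights = [4, 6, 3, 5]
cap = 9

items = sorted(range(len(values)), key=lambda i: values[i] / weights[i], reverse=True)

def relax(fixed):
    """Greedy bound with the variables in `fixed` forced to 0/1.
    Returns (bound, fractional_item or None); None means integral solution."""
    room = cap - sum(weights[i] for i in fixed if fixed[i] == 1)
    if room < 0:
        return float('-inf'), None          # subproblem infeasible
    bound = sum(values[i] for i in fixed if fixed[i] == 1)
    for i in items:
        if i in fixed:
            continue
        if weights[i] <= room:
            room -= weights[i]
            bound += values[i]
        else:                                # item taken fractionally
            return bound + values[i] * room / weights[i], i
    return bound, None

best = float('-inf')                         # incumbent value

def solve(fixed):
    global best
    bound, frac = relax(fixed)
    if bound <= best:
        return                               # bounding: prune this subproblem
    if frac is None:
        best = bound                         # relaxation solution is integral
        return
    solve({**fixed, frac: 1})                # branch on the fractional item
    solve({**fixed, frac: 0})

solve({})
print(best)   # 20: take items with values 13 and 7 (weight 6 + 3 = 9)
```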
Branching • Create two or more subproblems IP1, IP2,… IPn such that • Every feasible solution to IP appears in at least one (often exactly one) of IP1, IP2, … IPn • x* is infeasible to each of R(IP1), R(IP2), … R(IPn) • For the linear relaxation, we can choose a fractional xj* and create one problem with xj <= [xj*] and the other with xj >= [xj*]+1 ([x]: round down of x)
Illustration
[Figure: the feasible region split into subproblems IP1 and IP2 on either side of the fractional point x*]
Bounding • Along the way, we may find a solution x’ that is feasible to IP. If any subproblem has relaxation value c* >= cx’ then we can prune that subproblem: it cannot contain a solution better than x’. There is no sense continuing on that subproblem.
Stopping • The technique can stop early with a solution within a provable percentage of optimal (compare to the best relaxation value) • Can also be modified to generate all solutions (do not prune on ties)
How to make work better? • Better formulations • Better relaxations (cuts) • Better feasible solutions (heuristics)
Formulations • Different formulations of integer programs may act very differently: their relaxations might have radically different bounds • “Good Formulation” of integer program: provides a better relaxation value (all else being equal).
Back to Warehouse Example • Alternative formulation of “Only use if open constraint” x[i,j] <= y[j] for all i,j (versus) sum_i x[i,j] <= ny[j] • Which is better?
Comparing • Positives to original • Fewer constraints: linear relaxation should solve faster • Positives to disaggregate formulation • Much better bounds (consider having x[i,j]=1 for a particular i,j. What would y[j] be in the two formulations?) • (Almost) no comparison! Formulation with more constraints works much better.
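The bound difference is easy to see numerically: for a fixed assignment x, the disaggregated constraints force y[j] >= max_i x[i,j], while the aggregated one only forces y[j] >= (sum_i x[i,j])/n. A small sketch (the x values are made up for illustration):

```python
# Why the disaggregated formulation has a stronger linear relaxation:
# for a given assignment x, each formulation implies a lower bound on
# y[j], and the disaggregated bound is never smaller.

n = 4                                    # number of stores
x_col = [1.0, 0.0, 0.0, 0.0]             # x[i][j] for one fixed warehouse j

y_aggregate    = sum(x_col) / n          # from sum_i x[i,j] <= n*y[j]
y_disaggregate = max(x_col)              # from x[i,j] <= y[j] for all i

# The LP can set y[j] = 0.25 in the aggregate model but must pay the
# full opening cost (y[j] = 1) in the disaggregated model.
print(y_aggregate, y_disaggregate)       # 0.25 1.0
```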
Ideal Formulation gives convex hull of feasible integer points
Embarrassing Formulations • Some things are very hard to formulate with integer programming: • Traveling Salesman problem: great success story (IP approaches can optimize 15,000 city problems!), but best IP approaches begin with an exponentially sized formulation (no “good” compact formulation known). • Complicated operational requirements can be hard to formulate.
Further approaches • Branch and Price • Formulations with exponential number of variables with complexity in generating “good” variables (see Nemhauser): heavy use of dual values • Branch and Cut • Improving formulations by adding additional constraints to “cut off” linear relaxation solutions (more later)
Algorithmic Details Preprocessing Primal Heuristics Branching Cut Generation
Preprocessing • Process individual rows to • detect infeasibilities • detect redundancies • tighten right-hand-sides • tighten variable bounds • Probing: examine consequences of fixing a 0-1 variable • If infeasible, fix to the opposite bound • If other variables are forced to values, record the resulting implication inequalities
Preprocessing • Much like simple Constraint Programming
Improving Coefficients
3x1 - 2x2 <= 1
1. Convert to positive coefficients with y1 = 1 - x1:
3y1 + 2x2 >= 2
2. Note that the constraint is always satisfied when y1 = 1, so change the coefficient 3 to 2:
2y1 + 2x2 >= 2
3. Convert back to the original variable:
x1 - x2 <= 0
Improving (?) Coefficients
[Figure: the strengthened constraint cuts off (1/2, 1/4) and other fractional points]
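The improvement can be verified by enumeration: over 0-1 points the original and strengthened constraints agree exactly, but the strengthened one excludes fractional points such as (1/2, 1/4). A quick check:

```python
# Checking the coefficient improvement: the original row 3x1 - 2x2 <= 1
# and its strengthened form x1 - x2 <= 0 admit the same 0-1 points, but
# the strengthened form cuts off the fractional point (1/2, 1/4).
from itertools import product

original     = lambda x1, x2: 3*x1 - 2*x2 <= 1
strengthened = lambda x1, x2: x1 - x2 <= 0

# Same feasible 0-1 points...
for x1, x2 in product((0, 1), repeat=2):
    assert original(x1, x2) == strengthened(x1, x2)

# ...but the strengthened constraint excludes this fractional point.
print(original(0.5, 0.25), strengthened(0.5, 0.25))   # True False
```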
Manual or Automatic?
Modeling issue
• Generally easy to see
• Can provide problem knowledge to further reduce coefficients
• Automatic identification is not foolproof
Automatic issue
• Many opportunities only occur within the Branch and Bound tree as variables are fixed
• “More foolproof” as models change
Identifying Redundancy and Infeasibility Use upper and lower bounds on variables: • Redundancy: 3x1 - 4x2 + 2x3 <= 6 (max lhs is 5) • Infeasibility: 3x1 - 4x2 + 2x3 >= 6 (max lhs is 5) While very simple, can be used to fix variables, particularly within the B&B tree
PP: Fixing Variables Simple idea: if setting a variable to a value leads to infeasibility, then it must be set to the other value 3x1 - 4x2 + 2x3 - 3x4 >= 3 Setting x4 to 1 leaves 3x1 - 4x2 + 2x3 >= 6, the previous infeasible constraint, so x4 must be 0
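These row tests are easy to automate for 0-1 variables. The sketch below (not from the slides) computes the extreme left-hand-side values from the bounds, classifies a <= row as redundant or infeasible, and probes a >= row for forced fixings; on the slide's example, probing actually finds more than the single fixing shown (x1 = 1 and x2 = 0 as well as x4 = 0).

```python
# Row preprocessing for 0-1 variables: bound arithmetic and probing.

def lhs_range(coeffs):
    """(min, max) of sum coeffs[i]*x[i] over 0 <= x[i] <= 1."""
    lo = sum(a for a in coeffs if a < 0)
    hi = sum(a for a in coeffs if a > 0)
    return lo, hi

def classify_le(coeffs, rhs):
    """Classify the row `coeffs . x <= rhs` over 0-1 variables."""
    lo, hi = lhs_range(coeffs)
    if hi <= rhs:
        return "redundant"    # always satisfied
    if lo > rhs:
        return "infeasible"   # never satisfied
    return "binding"

def probe_fixings(coeffs, rhs):
    """For the row `coeffs . x >= rhs`, return {index: forced value}."""
    fixed = {}
    for i, a in enumerate(coeffs):
        for val in (0, 1):
            rest = coeffs[:i] + coeffs[i + 1:]
            _, hi = lhs_range(rest)
            if hi + a * val < rhs:     # this setting makes the row unsatisfiable
                fixed[i] = 1 - val     # so the variable must take the other value
    return fixed

print(classify_le([3, -4, 2], 6))        # 'redundant' (max lhs is 5)
print(probe_fixings([3, -4, 2, -3], 3))  # {0: 1, 1: 0, 3: 0}: x1=1, x2=0, x4=0
```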
PP: Implication Inequalities Many constraints embed restrictions that at most one of x and y (or their complements) are 1. This can lead to implication inequalities.
PP: Implication Inequalities Facility location: x1 + x2 + … + xm <= m·x0 x0 = 0 => x1 = 0; x0 = 0 => x2 = 0, etc. (1 - x0) + x1 <= 1 (or x1 <= x0); x2 <= x0, etc. Automatic disaggregation (stronger!)
PP: Clique Inequalities These inequalities found by “probing” (fix variable and deduce implications). These simple inequalities can be strengthened by combining mutually exclusive variables into a single constraint. Resulting clique inequalities are very “strong” when many variables combined.
Example: Sports Scheduling Problem: Given n teams, find an “optimal” (minimum distance, equal distance, etc.) double round robin (every team plays at every other team) schedule.
A:  @B  @C   D   B   C  @D
B:   A   D  @C  @A  @D   C
C:  @D   A   B   D  @A  @B
D:   C  @B  @A  @C   B   A
Sports Scheduling Many formulations (not wholly satisfactory) One method: One variable for every “home stand” (series of home games) and “away trip” (series of away games).
Variables (team A)
Some variables (H = home stand, @X = away game at X):
slot:  1    2    3    4
y1:    H
x1:    @B   @C
x2:    @C   @D
y2:         H    H
x3:              @E   @F
Constraints
• Can only do one thing in a time slot:
y1 + x1 + x2 <= 1
x1 + x2 + y2 <= 1
• No “Away after Away”:
x1 + x2 + x3 <= 1
• No “Home after Home”:
y1 + y2 <= 1
Additional constraints link teams
Improving Formulation
Create the implication graph: one node per variable (y1, x1, x2, y2, x3), with an edge between two variables that cannot both be 1.
[Figure: implication graph over the variables for slots 1-4]
Find cliques
• Cliques in the graph: at most one variable in a clique can be 1.
[Figure: cliques among y1, x1, x2, y2, x3 over slots 1-4]
Constraints
• Can only do one thing in a time slot:
y1 + x1 + x2 <= 1
x1 + x2 + y2 <= 1
• No “Away after Away”:
x1 + x2 + x3 + y2 <= 1
• No “Home after Home”:
y1 + y2 + x1 + x2 <= 1
Additional constraints link teams
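This strengthening can be automated: build the conflict graph (an edge joins two variables that cannot both be 1) and every clique yields a constraint "sum over the clique <= 1". A brute-force sketch on the five variables above, with the conflict edges taken from the slot-overlap and no-repeat rules on the earlier slides:

```python
# Enumerate maximal cliques of the conflict graph for {y1, x1, x2, y2, x3};
# each maximal clique gives a clique inequality "sum of its variables <= 1"
# that dominates the separate pairwise/triple constraints.
from itertools import combinations

vars_ = ["y1", "x1", "x2", "y2", "x3"]
conflicts = {                      # pairs that cannot both be 1
    ("y1", "x1"), ("y1", "x2"),    # same time slot
    ("x1", "x2"), ("x1", "y2"), ("x2", "y2"),
    ("y2", "x3"),                  # same time slot
    ("x1", "x3"), ("x2", "x3"),    # away trip right after away trip
    ("y1", "y2"),                  # home stand right after home stand
}
edge = lambda a, b: (a, b) in conflicts or (b, a) in conflicts

def maximal_cliques():
    cliques = []
    for r in range(len(vars_), 1, -1):          # largest cliques first
        for combo in combinations(vars_, r):
            if all(edge(a, b) for a, b in combinations(combo, 2)):
                if not any(set(combo) <= set(c) for c in cliques):
                    cliques.append(combo)       # keep only maximal ones
    return cliques

for clique in maximal_cliques():
    print(" + ".join(clique), "<= 1")
# y1 + x1 + x2 + y2 <= 1
# x1 + x2 + y2 + x3 <= 1
```

These are exactly the two strengthened constraints on this slide.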