Optimization Methods for the Single-Machine Problem


Presentation Transcript


  1. Optimization Methods for the Single-Machine Problem Chapter 3, Elements of Sequencing and Scheduling by Kenneth R. Baker. Byung-Hyun Ha

  2. Outline • Introduction • Adjacent pairwise interchange methods • A dynamic programming approach • Dominance property • A branch and bound approach • A mixed integer programming approach • from Baker and Trietsch, 2009 • Summary

  3. Introduction • Different scheduling procedures for different measures • F-problem and L-problem by SPT sequencing • U-problem by Algorithm 1 of Ch. 1 • T-problem by ?? • General purpose optimization methods • Dynamic programming approach • Branch and bound approach • Mathematical programming approach • ...

  4. Introduction • A general setting • gj(t) -- a penalty function, incurred when job j completes at time t • assumed to be nondecreasing in t • Typical scheduling problems • Maximum penalty problem • to minimize the maximum gj(t) • Total penalty problem • to minimize the sum of gj(t) • Theorem 1 • When the objective is to minimize the maximum penalty, job i may be assigned the last position in sequence if gi(P) ≤ gk(P) for all jobs k ≠ i, where P is the total processing time of all jobs. • Application of Theorem 1 (see the sketch below) • Tmax-problem -- gj(t) = max{0, t – dj}
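A minimal Python sketch of how Theorem 1 can be applied repeatedly, filling the sequence from the last position backward; the function and variable names are illustrative, the penalty functions are passed as callables, and the sample data reuse the three-job instance that appears later in these slides.

```python
def min_gmax_sequence(p, g):
    """Fill positions from last to first using Theorem 1.

    p : dict mapping job -> processing time
    g : dict mapping job -> penalty function g_j(t), nondecreasing in t
    """
    remaining = set(p)
    t = sum(p.values())              # P, completion time of whichever job goes last
    sequence = []
    while remaining:
        # Theorem 1: a job whose penalty at time t is smallest may take the last open position
        j = min(remaining, key=lambda k: g[k](t))
        sequence.append(j)
        remaining.remove(j)
        t -= p[j]                    # completion time of the job placed next (one position earlier)
    sequence.reverse()
    return sequence

# Tmax-problem: g_j(t) = max{0, t - d_j}; three-job data from slide 19
p = {1: 1, 2: 2, 3: 3}
d = {1: 4, 2: 2, 3: 3}
g = {j: (lambda t, dj=d[j]: max(0, t - dj)) for j in p}
print(min_gmax_sequence(p, g))       # [2, 3, 1], the earliest-due-date order, which minimizes Tmax
```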

  5. Adjacent Pairwise Interchange Methods • Thrust of adjacent pairwise interchange methods • Find a sequence for which every adjacent pairwise interchange leads to poorer performance • F-problem → the method yields an optimal sequence • T-problem → ?? • Three-job example (total tardiness T of each sequence): 1-2-3: T = 4, 1-3-2: T = 5, 2-1-3: T = 3, 2-3-1: T = 4, 3-1-2: T = 4, 3-2-1: T = 5 • Sequence 3-1-2 (T = 4) is a local optimum -- both adjacent interchanges, 1-3-2 and 3-2-1, give T = 5 -- yet the global optimum is 2-1-3 with T = 3. A sketch of the interchange procedure follows.
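A small sketch of the interchange search for the T-problem, using the job data p = (1, 2, 3), d = (4, 2, 3) inferred from the instantiation on slide 19 (this data reproduces the tardiness values listed above); started from 1-3-2 it stops at the local optimum 3-1-2 rather than the global optimum 2-1-3. Names are illustrative.

```python
def total_tardiness(seq, p, d):
    """Total tardiness of a given job sequence."""
    t, total = 0, 0
    for j in seq:
        t += p[j]                     # completion time of job j
        total += max(0, t - d[j])
    return total

def adjacent_interchange(seq, p, d):
    """Repeat adjacent pairwise interchanges until none improves total tardiness."""
    seq = list(seq)
    improved = True
    while improved:
        improved = False
        for i in range(len(seq) - 1):
            trial = seq[:]
            trial[i], trial[i + 1] = trial[i + 1], trial[i]
            if total_tardiness(trial, p, d) < total_tardiness(seq, p, d):
                seq, improved = trial, True
    return seq

p = {1: 1, 2: 2, 3: 3}
d = {1: 4, 2: 2, 3: 3}
print(adjacent_interchange([1, 3, 2], p, d))   # stops at [3, 1, 2] (T = 4), not the optimum [2, 1, 3] (T = 3)
```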

  6. Dynamic Programming Approach • Additive form of regular measures • Z = Σj=1..n gj(Cj) • gj(Cj) = max{0, Cj – dj} -- total tardiness • We can find an optimal sequence with dynamic programming • Notation • X -- set of all jobs, J -- subset of X • p(J) -- total time required to process the jobs in J • Dynamic programming formulation • G(J) = minj∈J{G(J – {j}) + gj(p(J))}, where G(∅) = 0 • minimum penalty for the subproblem consisting of the jobs in J • gj(p(J)) -- penalty incurred by job j when it is last among J, completing at p(J) • G(J – {j}) -- minimum penalty incurred by the remaining jobs • Then the minimum total penalty is G(X). (Diagram: the jobs of J occupy the first positions, with job j last among them completing at time p(J); the jobs of X – J follow.)

  7. Dynamic Programming Approach • Example -- X = {1, 2, 3}, gj(Cj) = max{0, Cj – dj}, with p1, p2, p3 = 1, 2, 3 and d1, d2, d3 = 4, 2, 3 (the same instance as slide 19) • G(X) = G({1,2,3}) = minj∈{1,2,3}{G({1,2,3} – {j}) + gj(p({1,2,3}))} = min{G({2,3}) + g1(p({1,2,3})), G({1,3}) + g2(p({1,2,3})), G({1,2}) + g3(p({1,2,3}))} = min{2 + 2, 0 + 4, 0 + 3} = 3 → Sequence: 2-1-3 • G({2,3}) = minj∈{2,3}{G({2,3} – {j}) + gj(p({2,3}))} = min{G({3}) + g2(p({2,3})), G({2}) + g3(p({2,3}))} = min{0 + 3, 0 + 2} = 2 • G({1,3}) = minj∈{1,3}{G({1,3} – {j}) + gj(p({1,3}))} = min{G({3}) + g1(p({1,3})), G({1}) + g3(p({1,3}))} = min{0 + 0, 0 + 1} = 0 • G({1,2}) = minj∈{1,2}{G({1,2} – {j}) + gj(p({1,2}))} = min{G({2}) + g1(p({1,2})), G({1}) + g2(p({1,2}))} = min{0 + 0, 0 + 1} = 0 • G({1}) = G(∅) + g1(p({1})) = g1(p1) = 0 • G({2}) = G(∅) + g2(p({2})) = g2(p2) = 0 • G({3}) = G(∅) + g3(p({3})) = g3(p3) = 0
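A minimal sketch of this dynamic program in Python (names are illustrative); the memoized function G mirrors the recursion above and also records the sequence.

```python
from functools import lru_cache

def dp_min_tardiness(p, d):
    """Dynamic program G(J) = min over j in J of { G(J - {j}) + g_j(p(J)) } for total tardiness."""
    @lru_cache(maxsize=None)
    def G(J):                          # J is a frozenset of jobs
        if not J:
            return 0, ()               # G(empty set) = 0
        pJ = sum(p[j] for j in J)      # p(J): completion time of whichever job of J is last
        best = None
        for j in J:                    # try each job j of J in the last position
            value, seq = G(J - {j})
            value += max(0, pJ - d[j])
            if best is None or value < best[0]:
                best = (value, seq + (j,))
        return best

    return G(frozenset(p))

p = {1: 1, 2: 2, 3: 3}
d = {1: 4, 2: 2, 3: 3}
print(dp_min_tardiness(p, d))          # (3, (2, 1, 3)), the optimal total tardiness and sequence 2-1-3
```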

  8. Dynamic Programming Approach • Computational effort • Proportional to n·2^n • (cf. U-problem -- O(n log n)) • Complete enumeration -- proportional to n! • Exercise • X = {1, 2, 3, 4}, gj(Cj) = max{0, Cj – dj}

  9. Dynamic Programming Approach • Computer implementation • Labeling scheme • lk = 2^(k–1), lJ = Σk∈J lk • Overall procedure (see the sketch below)
1. Set b(i) = 0 for i = 1..|X|, G(0) = 0
2. Loop
2-1. Find the smallest integer j for which b(j) = 0 • If all b(j) = 1, then stop
2-2. Set b(j) = 1
2-3. For all i < j, set b(i) = 0
2-4. Let J = {j | b(j) = 1}
2-5. Set G(lJ) = minj∈J{G(lJ – lj) + gj(p(J))}
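A sketch of the labeling idea using Python integers as bit masks: the label lJ is exactly the integer whose binary digits are the b(i) flags, so iterating labels in increasing order reproduces the binary-counting loop above (names are illustrative).

```python
def dp_min_tardiness_bitmask(p, d):
    """Labeling-scheme DP: subset J is encoded by the integer l_J = sum of 2^(k-1) for k in J."""
    n = len(p)
    INF = float("inf")
    G = [INF] * (1 << n)
    G[0] = 0                                                        # G(empty set) = 0
    for label in range(1, 1 << n):                                  # labels in increasing order
        pJ = sum(p[k] for k in range(1, n + 1) if label >> (k - 1) & 1)   # p(J)
        for j in range(1, n + 1):
            bit = 1 << (j - 1)                                      # l_j
            if label & bit:
                cand = G[label ^ bit] + max(0, pJ - d[j])           # G(l_J - l_j) + g_j(p(J))
                if cand < G[label]:
                    G[label] = cand
    return G[(1 << n) - 1]                                          # G(l_X)

p = {1: 1, 2: 2, 3: 3}
d = {1: 4, 2: 2, 3: 3}
print(dp_min_tardiness_bitmask(p, d))   # 3
```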

  10. Dominance Properties • Dominance properties involving schedules • Schedules without preemption and without inserted idle time constitute a dominant set • They reduce the number of alternatives • Dominance properties involving jobs • Theorem 2 • In the Tw-problem, suppose that there exists a job k for which dk ≥ p(X). Then job k may be assigned the last position in sequence. • Theorem 3 (Emmons, 1969) • In the T-problem, there is an optimal schedule in which job j follows job i if one of the following conditions is satisfied: (a) pi ≤ pj and di ≤ max{dj, p(Bj) + pj} (b) di ≤ dj and dj ≥ p(Ai') – pj (c) dj ≥ p(Ai') where • Ai -- set of jobs to follow job i in an optimal solution; i.e., the "after" set • Ai' -- complement of Ai; i.e., Ai' = X – Ai • Bi -- set of jobs to precede job i in an optimal solution; i.e., the "before" set

  11. Dominance Properties • Computer implementation • Modified labeling scheme • Nj = {i | i < j} • lj' = l'(Nj) + 1 – l'(Bj ∩ Nj), where l'(J) = Σk∈J lk' • Overall procedure
1. Set b(i) = 0 for i = 1..|X|, G(0) = 0
2. Loop
2-1. Find the smallest integer j for which b(j) = 0 • If all b(j) = 1, then stop
2-2. Set b(j) = 1
2-3. For all i < j, if j ∉ Ai, set b(i) = 0 (jobs that must precede j stay in the set)
2-4. Let J = {j | b(j) = 1}
2-5. Set G(l'(J)) = minj∈J{G(l'(J – {j})) + gj(p(J))}, taking the minimum only over jobs j with Aj ∩ J = ∅, so that J – {j} is itself a feasible subset
• Example of p. 3.12 -- 5 jobs • A1 = {2}, A2 = ∅, A3 = {4, 5}, A4 = ∅, A5 = ∅ → l1' = 1, l2' = 1, l3' = 3, l4' = 3, l5' = 6 → the number of candidate subsets is not 2^5 = 32 but only 14 (see the count below)
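A small Python sketch that checks the count of candidate subsets for this example: a subset is a candidate only if it contains every required predecessor of each of its members. The "before" sets B2 = {1}, B4 = B5 = {3} are derived from the stated "after" sets; names are illustrative.

```python
from itertools import combinations

# Required predecessors implied by A1 = {2} and A3 = {4, 5}
B = {1: set(), 2: {1}, 3: set(), 4: {3}, 5: {3}}
jobs = list(B)

# A nonempty subset S is a candidate if every member's predecessors are also in S
feasible = [S for r in range(1, len(jobs) + 1)
            for S in combinations(jobs, r)
            if all(B[j] <= set(S) for j in S)]
print(len(feasible))   # 14 nonempty candidate subsets (15 counting the empty set) instead of 2**5 = 32
```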

  12. Branch and Bound Approach • Iterations of two fundamental procedures • Branching and bounding • Branching • Partitioning a large problem into subproblems, i.e., replacing an original problem by a set of new problems that are (a) mutually exclusive and exhaustive subproblems of the original, (b) partially solved versions of the original, and (c) smaller problems than the original (Tree diagram: P(0) branches into P(1), P(2), ..., P(n); each of these branches further, e.g. P(2) into P(12), P(32), ..., P(n2); and so on down to complete sequences P(S))

  13. Branch and Bound Approach • Bounding • Curtailing the enumeration process by calculating a lower bound on the optimal value of a given subproblem • Suppose Z is the value of a known solution; if the lower bound of a subproblem is not less than Z, then the subproblem need not be considered any further. • Fathomed branches • Branches of a subproblem that, no matter how the remainder of the subproblem is solved, can never yield a solution with a value better than the known solution • Active subproblems • Subproblems that have been encountered in the branching process but that have not been eliminated by dominance properties and whose own subproblems have not yet been generated • Termination condition • A complete solution appears at the head of the active list, which is the list of active subproblems ranked by lower bound

  14. Branch and Bound Approach • Example: a branch and bound procedure for the T-problem • Notation • s -- partial sequence of jobs (occupying the last positions) from among the n jobs originally in the problem • js -- partial sequence in which s is immediately preceded by job j • s' -- the complement of s (the jobs not yet sequenced) • p(s) = Σj∈s' pj ; p(js) = p(s) – pj • P(s) -- subproblem consisting of sequences that end with the partial sequence s • P(0) -- original problem • vs = Σj∈s Tj ; vjs = max{0, p(s) – dj} + vs • bs -- lower bound of subproblem P(s)

  15. Branch and Bound Approach • Example: a branch and bound procedure for the T-problem (cont'd) • Algorithm 1 -- Branch and Bound (a sketch in code follows)
1. (Initialization) Place P(0) on the active list. Set v0 = 0 and p(∅) = Σj pj.
2. Remove the first subproblem, P(s), from the active list. Let k denote the number of jobs in the partial sequence s. If k = n, stop: the complete sequence s is optimal. Otherwise test Theorem 2 for P(s). If the property holds, go to Step 3; otherwise go to Step 4.
3. Let job j be the job with the latest due date in s'. Create the subproblem P(js) with p(js) = p(s) – pj ; vjs = vs ; and bjs = vs. Place P(js) on the active list, ranked by its lower bound. Return to Step 2.
4. Create (n – k) subproblems P(js), one for each j in the set s'. For P(js), let p(js) = p(s) – pj ; vjs = vs + p(s) – dj ; and bjs = vjs. Now place each P(js) on the active list, ranked by its lower bound bjs. Return to Step 2.
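A minimal Python sketch of Algorithm 1, using a priority queue as the active list (best-first search, i.e., jumptracking, with the simple bound bs = vs); names are illustrative and the sample data reuse the three-job instance from slide 19.

```python
import heapq

def branch_and_bound_T(p, d):
    """Best-first branch and bound for the T-problem, building sequences from the back."""
    jobs = list(p)
    n = len(jobs)
    total = sum(p.values())
    # Each active entry: (lower bound b_s, tardiness v_s, p(s) = time of the unscheduled jobs, partial sequence s)
    active = [(0, 0, total, ())]
    while active:
        b, v, ps, s = heapq.heappop(active)          # first subproblem on the active list
        if len(s) == n:
            return v, s                              # a complete sequence at the head is optimal
        free = [j for j in jobs if j not in s]       # s', the unscheduled jobs
        # Theorem 2: a job due no earlier than all remaining work may take the last open position
        late_due = [j for j in free if d[j] >= ps]
        if late_due:
            j = max(late_due, key=lambda k: d[k])
            heapq.heappush(active, (v, v, ps - p[j], (j,) + s))
        else:
            for j in free:                           # branch on every job in s'
                vjs = v + max(0, ps - d[j])
                heapq.heappush(active, (vjs, vjs, ps - p[j], (j,) + s))
    return None

p = {1: 1, 2: 2, 3: 3}
d = {1: 4, 2: 2, 3: 3}
print(branch_and_bound_T(p, d))   # (3, (2, 1, 3))
```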

  16. Branch and Bound Approach • Example -- T-problem with 5 jobs (tree diagram of the subproblems P(s) and their lower bounds: P(0) branches into P(1), ..., P(5); the search continues through P(53), P(54), P(15), ..., P(45) and deeper levels, ending with the complete sequence P(12435), with total tardiness 11)

  17. Branch and Bound Approach • A branch and bound procedure for the T-problem (cont'd) • Options • Lower bounds • bs = vs , or • bs = vs + minj∈s'{max{0, p(s) – dj}}, or • ... • Trial solutions for bounding • A schedule obtained during branching, or • One obtained by pursuing the tree to the bottom as rapidly as possible, or • One obtained by a heuristic, such as the MDD rule, or • ... • Branching • Jumptracking, or • Backtracking, or • ... • ...

  18. Mixed Integer Programming Approach • Example: T-problem by sequence-position decisions • Notation • n -- number of jobs • pj, dj -- processing time and due date of job j • xjk = 1 if job j is assigned to the kth position in the sequence; 0 otherwise • tk -- tardiness of the job in the kth position (summing over positions equals summing over jobs) • Formulation (a solver sketch follows) • Minimize • Σk=1..n tk • Subject to • Σj=1..n xjk = 1, ∀ positions k • Σk=1..n xjk = 1, ∀ jobs j • Σj=1..n pj Σu=1..k xju – Σj=1..n dj xjk ≤ tk , ∀ positions k • tk ≥ 0, ∀ positions k • xjk = 0 or 1, ∀ jobs j, ∀ positions k
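A sketch of this model in code, assuming the open-source PuLP package as a stand-in for a commercial MIP solver and using the three-job instance of the next slide; names are illustrative.

```python
from pulp import LpProblem, LpMinimize, LpVariable, LpBinary, lpSum, value

p = {1: 1, 2: 2, 3: 3}
d = {1: 4, 2: 2, 3: 3}
jobs = positions = list(range(1, 4))

model = LpProblem("T_problem_positions", LpMinimize)
x = LpVariable.dicts("x", (jobs, positions), cat=LpBinary)   # x[j][k] = 1 if job j is in position k
t = LpVariable.dicts("t", positions, lowBound=0)             # tardiness of the job in position k

model += lpSum(t[k] for k in positions)                      # minimize total tardiness
for k in positions:
    model += lpSum(x[j][k] for j in jobs) == 1               # exactly one job per position
for j in jobs:
    model += lpSum(x[j][k] for k in positions) == 1          # exactly one position per job
for k in positions:
    # completion time of the job in position k minus its due date is at most t_k
    model += (lpSum(p[j] * x[j][u] for j in jobs for u in positions if u <= k)
              - lpSum(d[j] * x[j][k] for j in jobs)) <= t[k]

model.solve()                                                 # default CBC solver
print(value(model.objective))                                 # 3 for this instance
print([j for k in positions for j in jobs if x[j][k].value() > 0.5])   # optimal sequence 2-1-3
```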

  19. Mixed Integer Programming Approach • Example: T-problem by sequence-position decisions (cont'd) • Instantiation of the T-problem with 3 jobs (p1, p2, p3 = 1, 2, 3; d1, d2, d3 = 4, 2, 3) • Minimize • t1 + t2 + t3 • Subject to • x11 + x21 + x31 = 1, x12 + x22 + x32 = 1, x13 + x23 + x33 = 1 • x11 + x12 + x13 = 1, x21 + x22 + x23 = 1, x31 + x32 + x33 = 1 • 1x11 + 2x21 + 3x31 – (4x11 + 2x21 + 3x31) ≤ t1 • 1(x11 + x12) + 2(x21 + x22) + 3(x31 + x32) – (4x12 + 2x22 + 3x32) ≤ t2 • 1(x11 + x12 + x13) + 2(x21 + x22 + x23) + 3(x31 + x32 + x33) – (4x13 + 2x23 + 3x33) ≤ t3 • t1, t2, t3 ≥ 0 • x11, x21, x31, x12, x22, x32, x13, x23, x33 = 0 or 1 • A feasible solution (not necessarily optimal) • x11 = 0, x21 = 0, x31 = 1, x12 = 1, x22 = 0, x32 = 0, x13 = 0, x23 = 1, x33 = 0, i.e., the sequence 3-1-2

  20. Mixed Integer Programming Approach • Example: T-problem by precedence decisions • Notation • n -- number of jobs • pj , dj -- processing time and due date of job j • yij = 1 if job i is scheduled before job j in the sequence; 0 otherwise, for jobs i < j • sj , tj -- start time and tardiness of job j • M -- a big number (e.g., the total processing time) • Formulation (a solver sketch follows) • Minimize • Σj=1..n tj • Subject to • si + pi ≤ sj + M(1 – yij), ∀ jobs i < j • sj + pj ≤ si + M·yij, ∀ jobs i < j • sj + pj – dj ≤ tj , ∀ jobs j • sj ≥ 0, tj ≥ 0, ∀ jobs j • yij = 0 or 1, ∀ jobs i < j • Exercise: instantiate the T-problem with 3 jobs.
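A corresponding sketch of the precedence-decision model, again assuming PuLP and using the same three-job data; M is set to the total processing time, which is large enough for this model, and names are illustrative.

```python
from pulp import LpProblem, LpMinimize, LpVariable, LpBinary, lpSum, value

p = {1: 1, 2: 2, 3: 3}
d = {1: 4, 2: 2, 3: 3}
jobs = list(p)
M = sum(p.values())                                   # big-M: any schedule fits in [0, M]

model = LpProblem("T_problem_precedence", LpMinimize)
s = LpVariable.dicts("s", jobs, lowBound=0)           # start times
t = LpVariable.dicts("t", jobs, lowBound=0)           # tardiness
y = {(i, j): LpVariable(f"y_{i}_{j}", cat=LpBinary)   # y[(i, j)] = 1 if job i precedes job j
     for i in jobs for j in jobs if i < j}

model += lpSum(t[j] for j in jobs)                    # minimize total tardiness
for (i, j), yij in y.items():
    model += s[i] + p[i] <= s[j] + M * (1 - yij)      # either i finishes before j starts ...
    model += s[j] + p[j] <= s[i] + M * yij            # ... or j finishes before i starts
for j in jobs:
    model += s[j] + p[j] - d[j] <= t[j]               # tardiness definition

model.solve()
print(value(model.objective), sorted(jobs, key=lambda j: s[j].value()))   # 3 and the sequence 2-1-3
```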

  21. Team Assignment • TEAM HOMEWORK #1 • Implement a dynamic programming approach for solving the T-problem. • TEAM HOMEWORK #2 • Implement a branch and bound approach for solving the T-problem. • TEAM HOMEWORK #3 • Solve the T-problem using a commercial MIP (mixed integer programming) solver.

  22. Summary • Sequencing and scheduling: notoriously difficult problems • Relatively few situations can be analyzed by exploiting special structure • General-purpose techniques for optimal solutions, in this chapter • Heuristic methods for relatively good solutions, in the next chapter • Some options for the efficiency of the general methods • Dynamic programming approach • Efficient computer implementation • Labeling schemes and set-generation algorithms • Dominance properties • Tw-problem up to 30 jobs (Schrage and Baker, 1978) • T-problem up to 100 jobs (Potts and Van Wassenhove, 1982) • Branch and bound approach • Lower bound calculation • Initial trial solutions • Dominance check • Branching mechanism • Mixed integer programming approach
