
Case Studies: Bin Packing & The Traveling Salesman Problem



  1. Case Studies: Bin Packing & The Traveling Salesman Problem David S. Johnson, AT&T Labs – Research

  2. Outline • Lecture 1: Today • Introduction to Bin Packing (Problem 1) • Introduction to TSP (Problem 2) • Lecture 2: Today -- TSP • Lecture 3: Friday -- TSP • Lecture 4: Friday -- Bin Packing

  3. Applications • Packing commercials into station breaks • Packing files onto floppy disks (CDs, DVDs, etc.) • Packing MP3 songs onto CDs • Packing IP packets into frames, SONET time slots, etc. • Packing telemetry data into fixed size packets Standard Drawback: Bin Packing is NP-complete

  4. NP-Hardness • Means that optimal packings cannot be constructed in worst-case polynomial time unless P = NP, in which case all other NP-hard problems could also be solved in polynomial time: • Satisfiability • Clique • Graph coloring • Etc. • See M. R. Garey & D. S. Johnson, Computers and Intractability: A Guide to the Theory of NP-Completeness, W. H. Freeman, 1979.

  5. Standard Approach to Coping with NP-Hardness: • Approximation Algorithms • Run quickly (polynomial-time for theory, low-order polynomial time for practice) • Obtain solutions that are guaranteed to be close to optimal

  6. First Fit (FF): Put each item in the first bin with enough space. Best Fit (BF): Put each item in the bin that will hold it with the least space left over. First Fit Decreasing, Best Fit Decreasing (FFD, BFD): Start by reordering the items by non-increasing size, then apply FF or BF.
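
A minimal Python sketch of these three rules (illustrative only: the naive scan over bins is quadratic, while slides 15-17 describe tree-based implementations that run in O(N log N)):

```python
def first_fit(items, capacity=1.0):
    """Put each item in the first open bin with enough space."""
    bins = []  # each entry is the list of item sizes packed in that bin
    for x in items:
        for b in bins:
            if sum(b) + x <= capacity:
                b.append(x)
                break
        else:
            bins.append([x])          # no open bin fits: open a new one
    return bins


def best_fit(items, capacity=1.0):
    """Put each item in the feasible bin with the least space left over."""
    bins = []
    for x in items:
        best = min((b for b in bins if sum(b) + x <= capacity),
                   key=lambda b: capacity - sum(b) - x, default=None)
        if best is None:
            bins.append([x])
        else:
            best.append(x)
    return bins


def first_fit_decreasing(items, capacity=1.0):
    """FFD: reorder the items by non-increasing size, then run First Fit."""
    return first_fit(sorted(items, reverse=True), capacity)
```

Running first_fit on the bad instance of slide 8 (N items of size 1/2 - eps followed by N items of size 1/2 + eps) reproduces the 1.5-times-optimal behavior:

```python
eps, n = 0.01, 20
bad = [0.5 - eps] * n + [0.5 + eps] * n
print(len(first_fit(bad)))            # 30 bins, while OPT uses 20
```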

  7. Worst-Case Bounds • Theorem [Ullman, 1971][Johnson et al.,1974]. For all lists L, BF(L), FF(L) ≤ (17/10)OPT(L) + 3. • Theorem [Johnson, 1973]. For all lists L, BFD(L), FFD(L) ≤ (11/9)OPT(L) + 4. (Note 1: 11/9 = 1.222222…) (Note 2: These bounds are asymptotically tight.)

  8. Lower Bounds: FF and BF. Instance: N items of size 1/2 − ε arrive first, followed by N items of size 1/2 + ε. OPT: N bins (one item of each size per bin). FF, BF: N/2 bins for the small items + N bins for the large items = 1.5·OPT.

  9. Lower Bounds: FF and BF. Instance: N items of size 1/6 − 2ε, then N items of size 1/3 + ε, then N items of size 1/2 + ε. OPT: N bins (one item of each size per bin). FF, BF: N/6 bins + N/2 bins + N bins = (5/3)·OPT.

  10. Lower Bounds: FF and BF. Extending the construction with item sizes of roughly 1/2, 1/3, 1/7, 1/43, 1/1806, etc. (each perturbed by a tiny ε): OPT: N bins. FF, BF = N(1 + 1/2 + 1/6 + 1/42 + 1/1805 + …) ≈ 1.691·OPT.

  11. Jeff Ullman’s Idea. With item sizes 1/6 + ε, 1/3 + ε, and 1/2 + ε (OPT: N bins), FF, BF = N(1 + 1/2 + 1/5) = (17/10)·OPT. Key modification: replace some of the 1/3 + ε’s & 1/6 + ε’s with 1/3 − ε_i & 1/6 + ε_i, and some with 1/3 + ε_i & 1/6 − ε_i, with ε_i increasing from bin to bin. (Actually yields FF, BF ≥ (17/10)·OPT − 2. See References for details.)

  12. Lower Bounds: FFD and BFD. Item sizes: 1/2 + ε, 1/4 + 2ε, 1/4 + ε, and 1/4 − 2ε. OPT = 9n bins (6n bins + 3n bins). FFD, BFD = 11n bins (6n bins + 2n bins + 3n bins).

  13. Asymptotic Worst-Case Ratios • R_N(A) = max{A(I)/OPT(I) : OPT(I) = N} • R(A) = max{R_N(A) : N > 0} (the absolute worst-case ratio) • R∞(A) = limsup_{N→∞} R_N(A) (the asymptotic worst-case ratio) • Theorem: R∞(FF) = R∞(BF) = 17/10. • Theorem: R∞(FFD) = R∞(BFD) = 11/9.

  14. Why Asymptotics? • Partition Problem: Given a set A of numbers a_i, can they be partitioned into two subsets, each with sum (Σ_{a∈A} a)/2? • This problem is NP-hard, and is equivalent to the special case of bin packing in which we ask whether OPT = 2. • Hence, assuming P ≠ NP, no polynomial-time bin packing algorithm can have R(A) < 3/2.

  15. Implementing Best Fit • Put the unfilled bins in a binary search tree, ordered by the size of their unused space (choosing a data structure that implements inserts and deletes in time O(log N)). [Figure: each tree node stores a bin index and its gap size; bins with smaller gaps go in the left subtree, bins with larger gaps in the right.]

  16. Implementing Best Fit • When an item x arrives, the initial "current node" is the root of the tree. • While the item x is not yet packed: • If x fits in the current node's gap: if x also fits in the gap of the current node's left child, let that child become the current node; otherwise, delete the current node, pack x in its bin, and reinsert that bin into the tree with its new gap. • If x does not fit in the current node's gap: if the current node has a right child, let that child become the current node; otherwise, pack x in a new bin and add that bin to the tree.
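
A compact sketch of the same search in Python, keeping the open bins in a list of (gap, bin id) pairs sorted by gap: bisect_left finds the smallest gap that still fits, which is exactly the best-fit bin. (List insertion is O(N); the balanced tree described above brings each operation down to O(log N). The helper name is illustrative.)

```python
from bisect import bisect_left, insort

def best_fit_bisect(items, capacity=1.0):
    gaps = []      # (remaining gap, bin id), kept sorted by gap
    contents = []  # contents[bin_id] = items packed in that bin
    for x in items:
        i = bisect_left(gaps, (x, -1))   # leftmost gap >= x, i.e. the best fit
        if i == len(gaps):               # nothing fits: open a new bin
            contents.append([x])
            insort(gaps, (capacity - x, len(contents) - 1))
        else:
            gap, b = gaps.pop(i)         # remove the bin, repack, reinsert
            contents[b].append(x)
            insort(gaps, (gap - x, b))
    return contents
```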

  17. Implementing First Fit • New tree data structure: • The bins of the packing are the leaves, in their original order. • Each internal node contains the maximum gap among the bins under it. [Figure: an example of such a tree, with bin gaps at the leaves and running maxima at the internal nodes.]
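
One way to realize this structure is an array-based segment tree whose leaves are the bin gaps and whose internal nodes hold subtree maxima; descending left-first finds the lowest-indexed bin that fits in O(log N) per item. A sketch (the class name and the fixed max_bins bound are assumptions):

```python
class FirstFitTree:
    """First Fit in O(log N) per item: leaves hold bin gaps, internal
    nodes hold the maximum gap in their subtree."""

    def __init__(self, max_bins, capacity=1.0):
        self.cap = capacity
        self.size = 1
        while self.size < max_bins:      # round up to a power of two
            self.size *= 2
        self.gap = [0.0] * (2 * self.size)   # 0.0 marks a bin not yet opened
        self.bins_used = 0

    def pack(self, x):
        if self.gap[1] < x:              # no open bin fits: open a new one
            self._set(self.bins_used, self.cap)
            self.bins_used += 1
        node = 1
        while node < self.size:          # descend, preferring the left child
            node = 2 * node if self.gap[2 * node] >= x else 2 * node + 1
        bin_id = node - self.size
        self._set(bin_id, self.gap[node] - x)
        return bin_id

    def _set(self, bin_id, new_gap):
        node = self.size + bin_id
        self.gap[node] = new_gap
        node //= 2
        while node:                      # update maxima on the path to the root
            self.gap[node] = max(self.gap[2 * node], self.gap[2 * node + 1])
            node //= 2
```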

  18. Online Algorithms (+ Problem 1) • We have no control over the order in which items arrive. • Each time an item arrives, we must assign it to a bin before knowing anything about what comes later. • Examples: FF and BF • Simplest example: Next Fit (NF) • Start with an empty bin • If the next item to be packed does not fit in the current bin, put it in a new bin, which now becomes the “current” bin. Problem 1: Prove R∞(NF) = 2.
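
A sketch of Next Fit, handy for experimenting with Problem 1 (the helper name is illustrative):

```python
def next_fit(items, capacity=1.0):
    """Keep a single current bin; open a new one whenever an item does not fit."""
    bins, current = [], []
    for x in items:
        if sum(current) + x > capacity:   # does not fit: close the current bin
            bins.append(current)
            current = []
        current.append(x)
    if current:
        bins.append(current)
    return bins
```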

  19. Best Possible Online Algorithms • Theorem [van Vliet, 1996]. For any online bin packing algorithm A, R∞(A) ≥ 1.54. • Theorem [Richey, 1991]. There exist polynomial-time online algorithms A with R∞(A) ≤ 1.59. • Drawback: Improving the worst-case behavior of online algorithms tends to guarantee that their average-case behavior will not be much better. • FF and BF are much better on average than they are in the worst case.

  20. Best Possible Offline Algorithm? • Theorem [Karmarkar & Karp, 1982]. There is a polynomial-time (offline) bin packing algorithm KK that guarantees KK(L) ≤ OPT(L) + log^2(OPT(L)). • Corollary. R∞(KK) = 1. • Drawback: Whereas FFD and BFD can be implemented to run in time O(N log N), the best bound we have on the running time of KK is O(N^8 log^3 N). • Still open: Is there a polynomial-time bin packing algorithm A that guarantees A(L) ≤ OPT(L) + c for any fixed constant c?

  21. To Be Continued… • Next time: Average-Case Behavior • For now: On to the TSP!

  22. The Traveling Salesman Problem Given: A set of cities {c1, c2, …, cN}. For each pair of cities {ci, cj}, a distance d(ci, cj). Find: A permutation π that minimizes the tour length Σ_{i=1}^{N} d(c_{π(i)}, c_{π(i+1)}), where π(N+1) is read as π(1).
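
In code, the objective is simply the length of the closed tour; a small helper, assuming a distance function d on cities:

```python
def tour_length(perm, d):
    """Length of the closed tour that visits the cities in the order given by perm."""
    n = len(perm)
    return sum(d(perm[i], perm[(i + 1) % n]) for i in range(n))
```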

  23. N = 10

  24. N = 10

  25. Other Types of Instances • X-ray crystallography • Cities: orientations of a crystal • Distances: time for motors to rotate the crystal from one orientation to the other • High-definition video compression • Cities: binary vectors of length 64 identifying the summands for a particular function • Distances: Hamming distance (the number of terms that need to be added/subtracted to get the next sum)

  26. No-Wait Flowshop Scheduling • Cities: Length-4 vectors <c1, c2, c3, c4> of integer task lengths for a given job, which consists of tasks requiring 4 processors that must be used in order, where the task on processor i+1 must start as soon as the task on processor i is done. • Distances: d(c, c′) = increase in the finish time of the 4th processor if c′ is run immediately after c. • Note: Not necessarily symmetric: we may have d(c, c′) ≠ d(c′, c).
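
A hedged sketch of this distance, assuming the usual no-wait convention that a job started at time s occupies processor i during [s + c_1 + … + c_{i-1}, s + c_1 + … + c_i] (the exact convention used in the lecture may differ; the function name is illustrative):

```python
from itertools import accumulate

def nowait_distance(c, c_next):
    """Increase in the processor-4 finish time when c_next runs right after c."""
    pc = list(accumulate(c))                   # pc[i] = c_1 + ... + c_{i+1}
    pn = [0] + list(accumulate(c_next))[:-1]   # pn[i] = c'_1 + ... + c'_i
    # c_next must start at least `delta` after c so that on every processor
    # it arrives no earlier than c departs.
    delta = max(pc[i] - pn[i] for i in range(len(c)))
    return delta + sum(c_next) - sum(c)
```

Note that this quantity is asymmetric in general, matching the remark above.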

  27. How Hard? • NP-Hard for all the above applications and many more • [Karp, 1972] • [Papadimitriou & Steiglitz, 1976] • [Garey, Graham, & Johnson, 1976] • [Papadimitriou & Kanellakis, 1978] • …

  28. How Hard? Number of possible tours: N! = 1·2·3···(N−1)·N = 2^Θ(N log N). 10! = 3,628,800. 20! ≈ 2.43×10^18 (about 2.43 quintillion). Dynamic Programming Solution: O(N^2·2^N) = o(2^(N log N)).

  29. Dynamic Programming Algorithm • For each subset C′ of the cities containing c1, and each city c ∈ C′, let f(C′, c) = length of the shortest path that visits exactly the cities of C′, starting at c1 and ending at c. • f({c1}, c1) = 0 • For x ∉ C′, f(C′ ∪ {x}, x) = min_{c ∈ C′} [f(C′, c) + d(c, x)]. • Optimal tour length = min_{c ∈ C} [f(C, c) + d(c, c1)]. • Running time: ~(N−1)·2^(N−1) values to be computed, at cost O(N) each = O(N^2·2^N).
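
A direct transcription of this recurrence (the Held-Karp dynamic program) in Python, encoding subsets as bitmasks with city 0 in the role of c1; a sketch, not a tuned implementation:

```python
def held_karp(d):
    """d[i][j] = distance from city i to city j; returns the optimal tour length."""
    n = len(d)
    INF = float("inf")
    # f[(mask, c)] = length of the shortest path through the city set `mask`
    # (always containing city 0), starting at city 0 and ending at city c.
    f = {(1, 0): 0}
    for mask in range(1, 1 << n):
        if not mask & 1:
            continue                          # only subsets that contain city 0
        for c in range(n):
            if (mask, c) not in f:
                continue
            for x in range(1, n):
                if mask & (1 << x):
                    continue                  # x must lie outside the current subset
                key = (mask | (1 << x), x)
                cand = f[(mask, c)] + d[c][x]
                if cand < f.get(key, INF):
                    f[key] = cand
    full = (1 << n) - 1
    return min(f[(full, c)] + d[c][0] for c in range(1, n))
```

The table has ~(N−1)·2^(N−1) entries and each is relaxed in O(N) time, giving the O(N^2·2^N) bound from the slide.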

  30. How Hard? Number of possible tours: N! = 1·2·3···(N−1)·N = 2^Θ(N log N). 10! = 3,628,800. 20! ≈ 2.43×10^18 (about 2.43 quintillion). Dynamic Programming Solution: O(N^2·2^N). For N = 10: 10^2·2^10 = 102,400; for N = 20: 20^2·2^20 = 419,430,400.

  31. N = 10

  32. N = 100

  33. N = 1000

  34. N = 10000

  35. Planar Euclidean Application #1 • Cities: • Holes to be drilled in printed circuit boards

  36. N = 2392

  37. Planar Euclidean Application #2 • Cities: • Wires to be cut in a “Laser Logic” programmable circuit

  38. N = 7397

  39. N = 33,810

  40. N = 85,900

  41. Standard Approach to Coping with NP-Hardness: • Approximation Algorithms • Run quickly (polynomial-time for theory, low-order polynomial time for practice) • Obtain solutions that are guaranteed to be close to optimal • For the latter guarantee to be achievable for the TSP, we need the triangle inequality to hold: d(a,c) ≤ d(a,b) + d(b,c)

  42. Danger when No Δ-Inequality • Theorem [Karp, 1972]: Given a graph G = (V, E), it is NP-hard to determine whether G contains a Hamiltonian circuit (a collection of edges that makes up a tour). • Given a graph, construct a TSP instance in which d(c, c′) = 1 if {c, c′} ∈ E and d(c, c′) = N·2^N if {c, c′} ∉ E. • If a Hamiltonian circuit exists, OPT = N; if not, OPT > N·2^N. • A polynomial-time approximation algorithm that guaranteed a tour of length no more than 2^N·OPT would therefore imply P = NP.
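
The construction is mechanical; a hypothetical helper that builds the distance matrix from a graph on n vertices given as an edge list:

```python
def hamiltonian_to_tsp(n, edges):
    """Distances for the reduction: 1 on edges of G, N*2^N off edges.
    A tour of length N exists iff G has a Hamiltonian circuit."""
    big = n * 2 ** n
    edge_set = {frozenset(e) for e in edges}
    return [[1 if frozenset((i, j)) in edge_set else big for j in range(n)]
            for i in range(n)]
```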

  43. Nearest Neighbor (NN): Start with some city. Repeatedly go next to the nearest unvisited neighbor of the last city added. When all cities have been added, go from the last back to the first.
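
A sketch of Nearest Neighbor (assuming a distance function d on city objects; names are illustrative):

```python
def nearest_neighbor(cities, d, start=0):
    """Build a tour by repeatedly moving to the nearest unvisited city."""
    unvisited = set(range(len(cities))) - {start}
    tour = [start]
    while unvisited:
        last = tour[-1]
        nxt = min(unvisited, key=lambda j: d(cities[last], cities[j]))
        tour.append(nxt)
        unvisited.remove(nxt)
    return tour   # the tour closes by returning from the last city to the first
```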
