
Round and Approx: A technique for packing problems





Presentation Transcript


  1. Round and Approx: A technique for packing problems Nikhil Bansal (IBM Watson) Maxim Sviridenko (IBM Watson) Alberto Caprara (U. Bologna, Italy)

  2. Problems Bin Packing: Given n items with sizes s_1,…,s_n, s.t. 0 < s_i ≤ 1, pack all items into the least number of unit-size bins. d-dim Bin Packing (with & without rotations). [figure: labeled rectangles packed into 2-d bins]

  3. Problems d-dim Vector Packing: each item is a d-dim vector; a packing of a bin is valid if each coordinate-wise sum is ≤ 1. (Bin: machine with d resources; item: job with resource requirements.) Set Cover: items i_1,…,i_n and sets C_1,…,C_m; choose the fewest sets s.t. each item is covered. All three bin packing problems can be viewed as set cover, with the sets implicit: any subset of items that fits feasibly in a bin. [figure: a valid and an invalid vector packing]

  4. Short history of bin packing Bin Packing: NP-hard to decide whether 2 or 3 bins are needed (Partition problem). This does not rule out OPT + 1. Asymptotic approximation: α·OPT + O(1). Several constant factors in the 60s–70s. APTAS: for every ε > 0, (1+ε)·OPT + O(1) [de la Vega, Lueker 81]. OPT + O(log² OPT) [Karmarkar, Karp 82]. Outstanding open question: can we get OPT + 1? No worse integrality gap is known for a natural LP.

  5. Short history of bin packing 2-d Bin Packing: an APTAS would imply P = NP [B., Sviridenko 04]. Best results: without rotations 1.691… [Caprara 02]; with rotations 2 [Jansen, van Stee 05]. d-dim Vector Packing: no APTAS even for d = 2 [Woeginger 97]. Best result: O(log d) for constant d [Chekuri, Khanna 99]; if d is part of the input, a d^{1/2−ε} approximation would imply P = NP. Best known for d = 2 is a 2-approximation.

  6. Our Results 1) 2-d Bin Packing: ln 1.691 + 1 ≈ 1.52, both with and without rotations (previously 1.691 and 2). 2) d-dim Vector Packing: 1 + ln d (for constant d); for d = 2 this gives 1 + ln 2 ≈ 1.693 (previously 2).

  7. General Theorem Given a packing problem on items i_1,…,i_n: 1) if we can solve the set covering LP: min Σ_C x_C s.t. Σ_{C: i ∈ C} x_C ≥ 1 for all items i, and 2) we have a ρ-approximation that is subset oblivious, then we get a (ln ρ + 1)-approximation. A d subset-oblivious approximation for vector packing; the 1.691 algorithm of Caprara for 2-d bin packing is subset oblivious; we give a 1.691 subset-oblivious approximation for the rotation case (new).

  8. Subset Oblivious Algorithms Given an instance I with n items: χ(I) = the all-1s vector, χ(S) = the incidence vector of a subset of items S. There exist k weight vectors w_1, w_2,…,w_k (each n-dimensional) s.t. for every subset of items S ⊆ I and every ε > 0: 1) OPT(I) ≥ max_i (w_i · χ(I)) 2) Alg(S) ≤ ρ·max_i (w_i · χ(S)) + ε·OPT(I) + O(1)

  9. An (easy) example Any-Fit bin packing algorithm: consider items one by one; if the current item does not fit in any existing bin, put it in a brand-new bin. No two bins are filled ≤ 1/2 (implies ALG ≤ 2·OPT + 1). It is also a subset-oblivious 2-approximation, with k = 1 and w(i) = s_i (the size of item i): 1) OPT(I) ≥ Σ_{i ∈ I} s_i = w · χ(I) [volume bound] 2) Alg(S) ≤ 2·w · χ(S) + 1 [# bins ≤ 2·(total volume of S) + 1]
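The Any-Fit rule above can be sketched in a few lines of Python. This sketch scans open bins in order, i.e. it is First-Fit, one member of the Any-Fit family; the function name and the sample sizes are illustrative only:

```python
def any_fit(sizes):
    """Any-Fit bin packing: place each item into some open bin where it
    fits, opening a brand-new bin only when no open bin has room."""
    bins = []  # remaining capacity of each open bin
    for s in sizes:
        for i, cap in enumerate(bins):
            if s <= cap:
                bins[i] = cap - s  # item fits: use an existing bin
                break
        else:
            bins.append(1.0 - s)  # open a brand-new bin
    return len(bins)

sizes = [0.6, 0.5, 0.5, 0.4, 0.7, 0.3]
used = any_fit(sizes)
# Volume bound from the slide: ALG <= 2 * (total volume) + 1.
assert used <= 2 * sum(sizes) + 1
```

Since at most one open bin can be at most half full (two such bins would have been merged by the rule), the number of bins is at most twice the total volume plus one, matching the bound above.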

  10. Non-Trivial Example The asymptotic approximation scheme of de la Vega, Lueker: for any ε > 0, Alg ≤ (1+ε)·OPT + O(1/ε²). We will show it is subset oblivious.

  11. 1-d: Algorithm [figure: instance I shown as item sizes on the interval [0, 1]]

  12. 1-d: Algorithm Call items of size ≥ ε the "bigs". [figure: instance I with the threshold ε marked on [0, 1]]

  13. 1-d: Algorithm Partition the bigs into 1/ε² = O(1) groups with an equal number of objects in each. [figure: the groups of I rounded up to a new instance I′, with I′ ≥ I]

  14. 1-d: Algorithm Partition the bigs into 1/ε² = O(1) groups with an equal number of objects in each, and round each item up to the largest size in its group, giving instance I′. Then I′ ≥ I, I′ − (one group) ≤ I, so I′ ≈ I, and I′ has only O(1/ε²) distinct sizes.
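The linear-grouping step above can be sketched as follows; the function name and the sample instance are illustrative, not from the talk:

```python
import math

def linear_grouping(big_sizes, eps):
    """Round the big items via linear grouping: sort sizes descending,
    split into ~1/eps^2 equal-cardinality groups, and round every item
    up to the largest size in its group.  The rounded instance I' then
    has only O(1/eps^2) distinct sizes and dominates I item-by-item."""
    sizes = sorted(big_sizes, reverse=True)
    k = math.ceil(1 / eps**2)       # number of groups
    g = math.ceil(len(sizes) / k)   # items per group
    rounded = []
    for start in range(0, len(sizes), g):
        group = sizes[start:start + g]
        rounded += [group[0]] * len(group)  # round up to the group max
    return rounded

r = linear_grouping([0.9, 0.8, 0.7, 0.6, 0.5, 0.4], eps=0.5)
# Each rounded size dominates the corresponding original size.
assert all(a >= b for a, b in zip(r, sorted([0.9, 0.8, 0.7, 0.6, 0.5, 0.4], reverse=True)))
```

Dropping the first (largest) group of I′ leaves an instance dominated by I, which is why OPT(I′) exceeds OPT(I) by at most one group's worth of bins.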

  15. LP for the big items 1/ε² item types; let n_i denote the number of items of type i in the instance. LP: min Σ_C x_C s.t. Σ_C a_{i,C} x_C ≥ n_i for all size types i, where C indexes valid sets (at most (1/ε²)^{1/ε} of them) and a_{i,C} is the number of type-i items in set C. A basic solution has at most 1/ε² nonzero variables. Rounding: x → ⌈x⌉. Solution(big) ≤ Opt(big) + 1/ε².
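The rounding step on this slide is just a coordinate-wise ceiling of the basic LP solution. A minimal sketch, where the configuration names are hypothetical:

```python
import math

def round_up(x_frac):
    """Round each fractional configuration count up to an integer,
    dropping zero entries.  A basic LP solution has at most as many
    nonzero variables as constraints (1/eps^2 size types), so the
    ceilings add at most 1/eps^2 extra bins in total."""
    return {C: math.ceil(x) for C, x in x_frac.items() if x > 0}

# Hypothetical basic LP solution over three bin configurations.
x_frac = {"config_A": 2.4, "config_B": 0.7, "config_C": 0.0}
x_int = round_up(x_frac)  # {"config_A": 3, "config_B": 1}
```

The additive loss is bounded by the number of nonzero variables, not the number of items, which is what makes the O(1/ε²) guarantee possible.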

  16. Filling in smalls Take the solution on the bigs and fill in the smalls (i.e., items < ε) greedily. • If no more bins are needed, we are already within the previous bound. • If more bins are needed, every bin (except maybe one) is filled to 1 − ε, so Alg(I) ≤ Volume(I)/(1−ε) + 1 ≤ Opt/(1−ε) + 1. We will now show this is a subset oblivious algorithm!
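The greedy filling step can be sketched as follows; the function name and sample loads are illustrative only:

```python
def fill_smalls(big_bin_loads, smalls):
    """Greedily add small items (sizes < eps) into the bins packing the
    bigs, opening a new bin only when no existing bin has room.  If any
    new bin is opened, every bin except maybe the last is filled above
    1 - eps, since no small item fit in it."""
    loads = list(big_bin_loads)
    for s in smalls:
        for i, load in enumerate(loads):
            if load + s <= 1.0:
                loads[i] = load + s  # small item fits here
                break
        else:
            loads.append(s)  # no existing bin had room
    return loads

loads = fill_smalls([0.7, 0.85], [0.2, 0.1, 0.1, 0.05])
```

In the bad case every bin but one is filled past 1 − ε, so the bin count is at most Volume(I)/(1−ε) + 1, matching the slide's bound.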

  17. Subset Obliviousness LP: min Σ_C x_C s.t. Σ_C a_{i,C} x_C ≥ n_i for all item types i. Dual: max Σ_i n_i w_i s.t. Σ_i a_{i,C} w_i ≤ 1 for each set C. If we consider the dual for a subset of items S: max Σ_i |type-i items in S|·w_i s.t. Σ_i a_{i,C} w_i ≤ 1 for each set C. The dual polytope is independent of S; S only affects the objective function.

  18. Subset Obliviousness LP: min Σ_C x_C s.t. Σ_C a_{i,C} x_C ≥ n_i for all item types i. Dual: max Σ_i n_i w_i s.t. Σ_i a_{i,C} w_i ≤ 1 for each set C. Define a vector W_v for each vertex v of the dual polytope (O(1) vertices). By LP duality, LP*(S) = max_v W_v · χ(S). So Alg(S) ≤ LP*(S) + 1/ε² = max_v W_v · χ(S) + 1/ε², and Opt(I) ≥ LP(I) = max_v W_v · χ(I). Handling smalls: add another vector w, where w(i) = s_i.

  19. General Algorithm Theorem: can get a (ln ρ + 1)-approximation if 1) we can solve the set covering LP, and 2) we have a ρ-approximate subset oblivious algorithm. Algorithm: solve the set covering LP to get solution x*. Randomized rounding with parameter α > 0: choose each set C independently with probability α·x_C*. Residual instance: apply the subset oblivious approximation.

  20. Proof of General Theorem After randomized rounding, Prob[element i left uncovered] ≤ e^{−α}. Pf: Prob = Π_{C: i ∈ C} (1 − α·x_C) ≤ e^{−α} (as Σ_{C: i ∈ C} x_C ≥ 1). Hence E(w_i · χ(S)) ≤ e^{−α}·w_i · χ(I), and w_i · χ(S) is sharply concentrated (variance small; proof omitted), so max_i (w_i · χ(S)) ≈ e^{−α}·max_i (w_i · χ(I)) ≤ e^{−α}·OPT(I). But the subset oblivious property implies Alg(S) ≤ ρ·max_i (w_i · χ(S)) ≈ ρ·e^{−α}·OPT(I).
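The survival bound Π(1 − α·x_C) ≤ e^{−α} can be sanity-checked numerically; the function name and random trial setup are illustrative, with picking probabilities clipped at 1 as an assumption:

```python
import math
import random

def survival_prob(alpha, fracs):
    """Probability that an item is left uncovered by randomized
    rounding, when each set C containing it is picked independently
    with probability min(1, alpha * x_C)."""
    p = 1.0
    for x in fracs:
        p *= 1 - min(1.0, alpha * x)
    return p

# Check Prob[i uncovered] <= e^{-alpha} on random LP values x_C that
# satisfy the covering constraint sum x_C >= 1 (normalised to exactly 1).
random.seed(0)
alpha = 1.5
for _ in range(100):
    fracs = [random.random() for _ in range(random.randint(1, 5))]
    total = sum(fracs)
    fracs = [x / total for x in fracs]
    assert survival_prob(alpha, fracs) <= math.exp(-alpha) + 1e-12
```

The bound follows from 1 − t ≤ e^{−t} applied factor by factor: Π(1 − α·x_C) ≤ Π e^{−α·x_C} = e^{−α·Σx_C} ≤ e^{−α}.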

  21. Proof of General Algorithm Expected cost = randomized rounding cost + residual instance cost ≈ α·(LP cost) + ρ·e^{−α}·Opt. This gives an (α + ρ·e^{−α})-approximation; optimizing over α gives a (1 + ln ρ)-approximation.
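The optimization over α can be checked numerically: α + ρ·e^{−α} is minimized at α = ln ρ, where it equals 1 + ln ρ. A small sketch (the grid search is illustrative; in the 2-d case ρ = 1.691, Caprara's ratio):

```python
import math

def total_cost(alpha, rho):
    """Combined ratio: randomized rounding pays ~alpha * LP, and the
    residual subset oblivious algorithm pays ~rho * e^{-alpha} * OPT."""
    return alpha + rho * math.exp(-alpha)

rho = 1.691  # Caprara's ratio for 2-d bin packing
best = min(total_cost(a / 1000.0, rho) for a in range(1, 5000))
# The minimum sits at alpha = ln(rho), giving ratio 1 + ln(rho) ~ 1.525.
assert abs(best - (1 + math.log(rho))) < 1e-4
```

Setting the derivative 1 − ρ·e^{−α} to zero gives α = ln ρ, and substituting back yields ln ρ + ρ·(1/ρ) = 1 + ln ρ, the ratio claimed in the theorem.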

  22. Wrapping up d-dim vector packing: partition instance I into d parts I_1,…,I_d, where I_j consists of the items whose jth dimension is largest. Solving each I_j is just a bin packing problem, so a (1+ε)-approximation for bin packing gives a (d+ε) subset oblivious algorithm. 2-d bin packing: harder. A framework for incorporating structural information into set cover. Other problems?
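The partition step for vector packing can be sketched directly; the function name, sample vectors, and first-index tie-breaking are assumptions of this sketch:

```python
def partition_by_largest_dim(items):
    """Split a d-dim vector packing instance into d bin packing
    instances: each item goes to part j where its jth coordinate is
    largest (ties broken toward the lowest index)."""
    d = len(items[0])
    parts = [[] for _ in range(d)]
    for v in items:
        j = max(range(d), key=lambda k: v[k])  # index of largest coord
        parts[j].append(v)
    return parts

items = [(0.6, 0.2), (0.1, 0.9), (0.5, 0.5), (0.3, 0.7)]
parts = partition_by_largest_dim(items)
```

Within part I_j every other coordinate of an item is at most its jth coordinate, so packing I_j by the jth coordinate alone is a valid 1-d bin packing, at the cost of a factor d overall.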

  23. Questions?
