
One for the Price of Two: A Unified Approach to Approximating Optimization Problems






Presentation Transcript


  1. One for the Price of Two: A Unified Approach to Approximating Optimization Problems
  • Reuven Bar-Yehuda
  • CS Technion IIT
  • Slides and papers at: http://www.cs.technion.ac.il/~reuven

  2. Example VC
  • Given a graph G=(V,E) and a penalty p_v ∈ Z for each v ∈ V
  • Min Σ_v p_v·x_v
  • s.t.: x_v ∈ {0,1} ∀ v ∈ V
  • x_v + x_u ≥ 1 ∀ {v,u} ∈ E

  3. Linear Programming (LP) / Integer Programming (IP)
  • Given a profit [penalty] vector p
  • Maximize [Minimize] p·x
  • Subject to: linear constraints F(x)
  • IP: “x is an integer vector” is an additional constraint

  4. Example VC
  • Given a graph G=(V,E) and a penalty vector p ∈ Zⁿ
  • Minimize p·x
  • Subject to: x ∈ {0,1}ⁿ
  • x_i + x_j ≥ 1 ∀ {i,j} ∈ E

  5. Example SC
  • Given a collection S1, S2, …, Sn of subsets of {1,2,…,m} and a penalty vector p ∈ Zⁿ
  • Minimize p·x
  • Subject to: x ∈ {0,1}ⁿ
  • Σ_{i : j ∈ Si} x_i ≥ 1 ∀ j = 1..m
  • (figure: sets S1, S2, S3, …, Sn covering the elements 1, 2, 3, …, m)

  6. Example Min Cut
  • Given a network N(V,E), s,t ∈ V and a capacity vector p ∈ Z^|E|
  • Minimize p·x
  • Subject to: x ∈ {0,1}^|E|
  • Σ_{e ∈ P} x_e ≥ 1 ∀ s–t path P

  7. Example Shortest Path
  • Given a digraph G(V,E), s,t ∈ V and a length vector p ∈ Z^|E|
  • Minimize p·x
  • Subject to: x ∈ {0,1}^|E|
  • Σ_{e ∈ P} x_e ≥ 1 ∀ s–t cut P

  8. Example MST (Minimum Spanning Tree)
  • Given a graph G(V,E) and a length vector p ∈ Z^|E|
  • Minimize p·x
  • Subject to: x ∈ {0,1}^|E|
  • Σ_{e ∈ P} x_e ≥ 1 ∀ cut P

  9. Example Minimum Steiner Tree
  • Given a graph G(V,E), a terminal set T ⊆ V and a length vector p ∈ Z^|E|
  • Minimize p·x
  • Subject to: x ∈ {0,1}^|E|
  • Σ_{e ∈ P} x_e ≥ 1 ∀ cut P separating T

  10. Example Generalized Steiner Forest
  • Given a graph G(V,E), terminal sets T1, T2, …, Tk ⊆ V and a length vector p ∈ Z^|E|
  • Min p·x
  • s.t.: x ∈ {0,1}^|E|
  • Σ_{e ∈ P} x_e ≥ 1 ∀ i, ∀ cut P separating Ti

  11. Example IS (Maximum Independent Set)
  • Given a graph G=(V,E) and a profit vector p ∈ Zⁿ
  • Maximize p·x
  • Subject to: x ∈ {0,1}ⁿ
  • x_i + x_j ≤ 1 ∀ {i,j} ∈ E

  12. Maximum Independent Set in Interval Graphs
  • (figure: activities 1–9 drawn as intervals on a time axis)
  • Maximize Σ_I p_I·x_I
  • s.t. for each instance I: x_I ∈ {0,1}
  • for each time t: Σ_{I active at t} x_I ≤ 1

  13. The Local-Ratio Technique: Basic definitions
  • Given a penalty [profit] vector p
  • Minimize [Maximize] p·x
  • Subject to: feasibility constraints F(x)
  • x is an r-approximation if F(x) and p·x ≤ r·(p·x*) [p·x ≥ r·(p·x*)]
  • An algorithm is an r-approximation if, for any p and F, it returns an r-approximation

  14. The Local-Ratio Theorem
  • x is an r-approximation with respect to p1
  • x is an r-approximation with respect to p − p1
  • ⇒ x is an r-approximation with respect to p
  • Proof (for minimization, with p2 = p − p1):
  • p1·x ≤ r·p1*
  • p2·x ≤ r·p2*
  • ⇒ p·x ≤ r·(p1* + p2*) ≤ r·(p1 + p2)*
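The last inequality, r·(p1* + p2*) ≤ r·(p1 + p2)*, is the only step the slide leaves implicit; a short justification in standard notation, not part of the original deck: let x* be an optimal solution with respect to p = p1 + p2. Since x* is feasible for both weight functions,

\[
  p_1^* + p_2^* \;\le\; p_1\cdot x^* + p_2\cdot x^* \;=\; (p_1 + p_2)\cdot x^* \;=\; p^*,
\]

and therefore p·x = p1·x + p2·x ≤ r·(p1* + p2*) ≤ r·p*.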

  15. Special case: Optimization is a 1-approximation
  • x is optimal with respect to p1
  • x is optimal with respect to p − p1
  • ⇒ x is optimal with respect to p

  16. A Local-Ratio Schema for Minimization [Maximization] problems
  • Algorithm r-ApproxMin[Max](Set, p)
  • If Set = ∅ then return ∅;
  • If ∃ I ∈ Set with p(I) = 0 then return {I} ∪ r-ApproxMin(Set − {I}, p);
  • [If ∃ I ∈ Set with p(I) ≤ 0 then return r-ApproxMax(Set − {I}, p);]
  • Define “good” p1;
  • REC = r-ApproxMin[Max](Set, p − p1);
  • If REC is not an r-approximation w.r.t. p1 then “fix it”;
  • return REC;
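The same schema as a minimal Python sketch for the minimization case. The problem-specific pieces (choosing a "good", r-effective p1 and the optional "fix it" step) are passed in as callbacks; all function and variable names here are illustrative, not from the deck:

    # Minimal sketch of the local-ratio schema for minimization problems.
    # items: a set of elements; p: dict item -> penalty.
    # choose_p1(items, p) must return an r-effective weight function that
    # zeroes out at least one item, otherwise the recursion makes no progress.
    def lr_approx_min(items, p, choose_p1, fix_up):
        if not items:
            return set()
        zero = next((i for i in items if p[i] == 0), None)
        if zero is not None:                      # zero-penalty items are free
            return {zero} | lr_approx_min(items - {zero}, p, choose_p1, fix_up)
        p1 = choose_p1(items, p)                  # problem-specific "good" weights
        p_rest = {i: p[i] - p1[i] for i in items}
        rec = lr_approx_min(items, p_rest, choose_p1, fix_up)
        return fix_up(rec, items, p1)             # make rec an r-approx w.r.t. p1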

  17. The Local-Ratio Theorem: Applications
  Applications to some optimization algorithms (r = 1):
  • (MST) Minimum Spanning Tree (Kruskal)
  • (SHORTEST-PATH) s-t Shortest Path (Dijkstra)
  • (LONGEST-PATH) s-t DAG Longest Path (can be done with dynamic programming)
  • (INTERVAL-IS) Independent Set in Interval Graphs (usually done with dynamic programming)
  • (LONG-SEQ) Longest (weighted) monotone subsequence (can be done with dynamic programming)
  • (MIN-CUT) Minimum Capacity s-t Cut (e.g. Ford, Dinitz)
  Applications to some 2-approximation algorithms (r = 2):
  • (VC) Minimum Vertex Cover (Bar-Yehuda and Even)
  • (FVS) Vertex Feedback Set (Becker and Geiger)
  • (GSF) Generalized Steiner Forest (Williamson, Goemans, Mihail, and Vazirani)
  • (Min 2SAT) Minimum Two-Satisfiability (Gusfield and Pitt)
  • (2VIP) Two Variable Integer Programming (Bar-Yehuda and Rawitz)
  • (PVC) Partial Vertex Cover (Bar-Yehuda)
  • (GVC) Generalized Vertex Cover (Bar-Yehuda and Rawitz)
  Applications to some other approximations:
  • (SC) Minimum Set Cover (Bar-Yehuda and Even)
  • (PSC) Partial Set Cover (Bar-Yehuda)
  • (MSP) Maximum Set Packing (Arkin and Hassin)
  Applications to resource allocation and scheduling: …

  18. The creative part… find r-effective weights
  • p1 is r-effective if every feasible solution is an r-approximation w.r.t. p1
  • i.e. p1·x ≤ r·p1* for every feasible x
  • 2-effective weights for VC (vertex cover): Edge, Matching, Greedy, Homogeneous

  19. VC: Recursive implementation (edge by edge)
  • VC(V, E, p)
  • If E = ∅ return ∅;
  • If ∃v with p(v) = 0 return {v} + VC(V − {v}, E − E(v), p);
  • Let (x,y) ∈ E;
  • Let ε = min{p(x), p(y)};
  • Define p1(v) = ε if v = x or v = y, and 0 otherwise;
  • Return VC(V, E, p − p1)
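A runnable Python version of this recursion, under the assumption that penalties are nonnegative integers; function and variable names are mine, not from the slides:

    # Recursive edge-by-edge local-ratio 2-approximation for weighted Vertex Cover.
    # edges: list of frozensets {u, v}; p: dict vertex -> nonnegative penalty.
    def vc_recursive(edges, p):
        if not edges:
            return set()
        # A zero-penalty endpoint is taken for free and its edges are removed.
        zero = next((v for e in edges for v in e if p[v] == 0), None)
        if zero is not None:
            return {zero} | vc_recursive([e for e in edges if zero not in e], p)
        # Otherwise pick any edge (x, y) and subtract eps = min(p(x), p(y))
        # from both endpoints (this is the 2-effective weight function p1).
        x, y = tuple(edges[0])
        eps = min(p[x], p[y])
        p_new = {v: p[v] - (eps if v in (x, y) else 0) for v in p}
        return vc_recursive(edges, p_new)

    # Tiny example: a path a-b-c with penalties 3, 1, 2 -> cover {'b'}.
    print(vc_recursive([frozenset("ab"), frozenset("bc")], {"a": 3, "b": 1, "c": 2}))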

  20. VC: Iterative implementation (edge by edge)
  • VC(V, E, p)
  • for each e ∈ E:
  • let ε = min{p(v) | v ∈ e};
  • for each v ∈ e: p(v) = p(v) − ε;
  • return {v | p(v) = 0};
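The same algorithm in a few lines of Python; a sketch with illustrative names, not the author's code:

    # Iterative edge-by-edge local-ratio 2-approximation for weighted Vertex Cover:
    # sweep the edges once, subtract the local minimum from both endpoints,
    # and return every vertex whose penalty dropped to zero.
    def vc_iterative(edges, p):
        p = dict(p)                   # keep the caller's penalties intact
        for u, v in edges:
            eps = min(p[u], p[v])
            p[u] -= eps
            p[v] -= eps
        return {v for v in p if p[v] == 0}

    print(vc_iterative([("a", "b"), ("b", "c")], {"a": 3, "b": 1, "c": 2}))  # {'b'}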

  21. (figure: supermarket shelf with the seven products and their prices)
  • Min 5x_Bisli + 8x_Tea + 12x_Water + 10x_Bamba + 20x_Shampoo + 15x_Popcorn + 6x_Chocolate
  • s.t. x_Shampoo + x_Water ≥ 1

  22. Movie: 1 4 the price of 2

  23. VC: Iterative implementation (edge by edge)
  • VC(V, E, p)
  • for each e ∈ E:
  • let ε = min{p(v) | v ∈ e};
  • for each v ∈ e: p(v) = p(v) − ε;
  • return {v | p(v) = 0};
  • (figure: example graph with vertex penalties 30, 15, 90, 10, 50, 100, 80, 2)

  24. VC: Greedy (an O(H(Δ))-approximation), where H(Δ) = 1 + 1/2 + 1/3 + … + 1/Δ = O(ln Δ)
  • Greedy_VC(V, E, p)
  • C = ∅;
  • while E ≠ ∅
  • let v = arg min p(v)/d(v)
  • C = C + {v};
  • V = V − {v};
  • return C;
  • (figure: tight example with vertex groups of sizes n, n/2, n/3, n/4, …, n/Δ)
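For comparison, a short Python sketch of this greedy rule (illustrative names); unlike the local-ratio variants on the next slides, it only guarantees an H(Δ) ratio:

    # Greedy weighted Vertex Cover: repeatedly take the vertex minimizing
    # penalty / current degree. Only an H(max degree)-approximation.
    def vc_greedy(edges, p):
        edges = [frozenset(e) for e in edges]
        cover = set()
        while edges:
            deg = {}
            for e in edges:                      # current degrees
                for v in e:
                    deg[v] = deg.get(v, 0) + 1
            v = min(deg, key=lambda u: p[u] / deg[u])
            cover.add(v)
            edges = [e for e in edges if v not in e]
        return cover

    print(vc_greedy([("a", "b"), ("b", "c")], {"a": 3, "b": 1, "c": 2}))  # {'b'}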

  25. VC: LR-Greedy (star by star)
  • LR_Greedy_VC(V, E, p)
  • C = ∅;
  • while E ≠ ∅
  • let v = arg min p(v)/d(v)
  • let ε = p(v)/d(v);
  • C = C + {v};
  • V = V − {v};
  • for each u ∈ N(v): p(u) = p(u) − ε;
  • return C;
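A Python sketch of the star-by-star variant (illustrative names). The weights placed on the star around v are 2-effective, so the returned cover is a 2-approximation:

    # LR-Greedy for weighted Vertex Cover, star by star: take the vertex v
    # minimizing eps = p(v)/d(v), add it to the cover, and subtract eps from
    # each neighbour (a local-ratio step on the star around v).
    from collections import defaultdict

    def vc_lr_greedy(edges, p):
        p = dict(p)
        adj = defaultdict(set)
        for u, v in edges:
            adj[u].add(v)
            adj[v].add(u)
        cover = set()
        while any(adj[v] for v in adj):
            v = min((u for u in adj if adj[u]), key=lambda u: p[u] / len(adj[u]))
            eps = p[v] / len(adj[v])
            cover.add(v)
            for u in adj[v]:
                p[u] -= eps
                adj[u].discard(v)
            adj[v].clear()
        return cover

    print(vc_lr_greedy([("a", "b"), ("b", "c")], {"a": 3, "b": 1, "c": 2}))  # {'b'}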

  26. VC: LR-Greedy by reducing 2-effective homogeneous weights
  Homogeneous = all vertices have the same “greedy value” p(v)/d(v)
  • LR_Greedy_VC(V, E, p)
  • C = ∅;
  • Repeat
  • Let ε = min p(v)/d(v);
  • For each v ∈ V: p(v) = p(v) − ε·d(v);
  • Move from V to C all zero-weight vertices;
  • Remove from V all zero-degree vertices;
  • Until E = ∅
  • Return C;
  • (figure: example graph with vertex penalties 3, 4, 6, 4, 3, 5, 3, 2)
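And the homogeneous variant from this slide in Python (illustrative names; penalties become floats after the subtraction, hence the small tolerance):

    # LR-Greedy via homogeneous weights: subtract eps*d(v) from every vertex,
    # where eps is the smallest greedy value p(v)/d(v); vertices that reach
    # zero weight join the cover, isolated vertices are dropped.
    from collections import defaultdict

    def vc_lr_homogeneous(edges, p):
        p = dict(p)
        adj = defaultdict(set)
        for u, v in edges:
            adj[u].add(v)
            adj[v].add(u)
        active = {v for v in adj if adj[v]}
        cover = set()
        while active:
            eps = min(p[v] / len(adj[v]) for v in active)
            for v in active:
                p[v] -= eps * len(adj[v])
            zeros = {v for v in active if p[v] <= 1e-12}
            cover |= zeros
            for v in zeros:                      # their edges are now covered
                for u in adj[v]:
                    adj[u].discard(v)
                adj[v].clear()
            active = {v for v in active - zeros if adj[v]}
        return cover

    print(vc_lr_homogeneous([("a", "b"), ("b", "c")], {"a": 3, "b": 1, "c": 2}))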

  27. Example MST (Minimum Spanning Tree)
  • Given a graph G(V,E) and a length vector p ∈ Z^|E|
  • Minimize p·x
  • Subject to: x ∈ {0,1}^|E|
  • Σ_{e ∈ P} x_e ≥ 1 ∀ cut P

  28. MST: Recursive implementation (Homogeneous)
  • MST(V, E, p)
  • If V = ∅ return ∅;
  • If ∃ self-loop e return MST(V, E − {e}, p);
  • If ∃ e with p(e) = 0 return {e} + MST(shrink(V, e), shrink(E, e), p);
  • Let ε = min{p(e) : e ∈ E};
  • Define p1(e) = ε for all e ∈ E;
  • Return MST(V, E, p − p1)

  29. MST: Iterative implementation (Homogeneous)
  • MST(V, E, p)
  • Kruskal
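A standard union-find Kruskal in Python, for completeness; sorting the edges by length plays the role of repeatedly subtracting the homogeneous weight ε from every remaining edge (names are illustrative):

    # Kruskal's algorithm: the iterative counterpart of the homogeneous
    # local-ratio recursion on the previous slide.
    def mst_kruskal(vertices, edges, p):
        parent = {v: v for v in vertices}

        def find(v):                              # union-find with path halving
            while parent[v] != v:
                parent[v] = parent[parent[v]]
                v = parent[v]
            return v

        tree = []
        for e in sorted(edges, key=lambda e: p[e]):
            ru, rv = find(e[0]), find(e[1])
            if ru != rv:                          # e crosses a cut: take it
                parent[ru] = rv
                tree.append(e)
        return tree

    edges = [("a", "b"), ("b", "c"), ("a", "c")]
    print(mst_kruskal("abc", edges, {("a", "b"): 1, ("b", "c"): 2, ("a", "c"): 3}))
    # -> [('a', 'b'), ('b', 'c')]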

  30. Some effective weights
