This seminar by Reuven Bar-Yehuda from Technion IIT delves into approximation algorithms for linear and integer programming optimization problems. The lecture covers various examples including Vertex Cover, Minimum Spanning Tree, and Minimum Cut, illustrating how approximation techniques can lead to effective solutions in complex scenarios. The focus will be on the local-ratio technique and its applications to optimization algorithms, demonstrating how to approach different problem formulations while minimizing penalties or maximizing profits. Access slides and papers at http://www.cs.technion.ac.il/~cs236813.
Seminar 236813: Approximation algorithms for LP/IP optimization problems • Reuven Bar-Yehuda • Technion IIT • Slides and papers at: http://www.cs.technion.ac.il/~cs236813
Example VC • Given a graph G=(V,E) and a penalty pv ∈ Z for each v ∈ V • Min ∑v pv·xv • S.t.: xv ∈ {0,1} • xv + xu ≥ 1 ∀ {v,u} ∈ E
Linear Programming (LP) / Integer Programming (IP) • Given a profit [penalty] vector p. • Maximize [Minimize] p·x • Subject to: linear constraints F(x) • IP: where "x is an integer vector" is an additional constraint
Example VC • Given a graph G=(V,E) and a penalty vector p ∈ Z^n • Minimize p·x • Subject to: x ∈ {0,1}^n • xi + xj ≥ 1 ∀ {i,j} ∈ E
Example SC • Given a collection S1, S2,…,Sn of subsets of {1,2,3,…,m} and a penalty vector p ∈ Z^n • Minimize p·x • Subject to: x ∈ {0,1}^n • ∑i: j∈Si xi ≥ 1 ∀ j = 1..m
Example Min Cut • Given a network N(V,E), s,t ∈ V and a capacity vector p ∈ Z^|E| • Minimize p·x • Subject to: x ∈ {0,1}^|E| • ∑e∈P xe ≥ 1 ∀ s-t path P
Example Min Path • Given a digraph G(V,E), s,t ∈ V and a length vector p ∈ Z^|E| • Minimize p·x • Subject to: x ∈ {0,1}^|E| • ∑e∈P xe ≥ 1 ∀ s-t cut P
Example MST (Minimum Spanning Tree) • Given a graph G(V,E) and a length vector p ∈ Z^|E| • Minimize p·x • Subject to: x ∈ {0,1}^|E| • ∑e∈P xe ≥ 1 ∀ cut P
Example Minimum Steiner Tree • Given a graph G(V,E), terminals T ⊆ V and a length vector p ∈ Z^|E| • Minimize p·x • Subject to: x ∈ {0,1}^|E| • ∑e∈P xe ≥ 1 ∀ T's cut P
Example Generalized Steiner Forest • Given a graph G(V,E), T1, T2, …, Tk ⊆ V • and a length vector p ∈ Z^|E| • Min p·x • S.t.: x ∈ {0,1}^|E| • ∑e∈P xe ≥ 1 ∀i ∀ Ti's cut P
Example IS (Maximum Independent Set) • Given a graph G=(V,E) and a profit vector p ∈ Z^n • Maximize p·x • Subject to: x ∈ {0,1}^n • xi + xj ≤ 1 ∀ {i,j} ∈ E
Maximum Independent Set in Interval Graphs • [Figure: activities 1–9 drawn as intervals along a time axis] • Maximize ∑I p(I)·x(I) • s.t.: For each instance I: x(I) ∈ {0,1} • For each time t: ∑I: t∈I x(I) ≤ 1
The Local-Ratio Technique: Basic definitions • Given a penalty [profit] vector p. • Minimize [Maximize] p·x • Subject to: feasibility constraints F(x) • x is an r-approximation if F(x) and p·x ≤ r·(p·x*) [p·x ≥ r·(p·x*)] • An algorithm is an r-approximation if for any p, F it returns an r-approximation
The Local-Ratio Theorem: • x is an r-approximation with respect to p1 • x is an r-approximation with respect to p2 = p − p1 • ⇒ x is an r-approximation with respect to p • Proof (for minimization): • p1·x ≤ r·p1* • p2·x ≤ r·p2* • ⇒ p·x ≤ r·(p1* + p2*) ≤ r·(p1 + p2)* = r·p*
The Local-Ratio Theorem (Proof 2): • x is an r-approximation with respect to p1 • x is an r-approximation with respect to p2 = p − p1 • ⇒ x is an r-approximation with respect to p • Proof 2 (for minimization): • Let x*, x1*, x2* be optimal solutions for p, p1, p2 respectively • p1·x ≤ r·(p1·x1*) • p2·x ≤ r·(p2·x2*) • ⇒ p·x ≤ r·(p1·x1* + p2·x2*) ≤ r·(p1·x* + p2·x*) = r·(p·x*)
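To make the decomposition argument concrete, here is a small worked instance (an illustration added here, not taken from the slides): weighted vertex cover on a triangle {1,2,3} with penalties p = (3,2,2), so the optimum cover is {2,3} with p* = 4. One local-ratio step on edge {1,2} with ε = min(3,2) = 2 gives:

```latex
\[
  p_1 = (2,2,0), \qquad p_2 = p - p_1 = (1,0,2).
\]
% Every cover contains vertex 1 or 2 and at most both of them, so
% p_1^* = 2 and p_1\cdot x \le 4 = 2\,p_1^* for every feasible x.
% The cover x=\{1,2\} is optimal for p_2 (p_2\cdot x = 1 = p_2^*), hence the theorem gives
\[
  p\cdot x = 5 \;\le\; 2\,(p_1^* + p_2^*) = 6 \;\le\; 2\,(p_1+p_2)^* = 2\,p^* = 8 .
\]
```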
Special case: Optimization is 1-approximation • x is an optimum with respect to p1 • x is an optimum with respect to p − p1 • ⇒ x is an optimum with respect to p
A Local-Ratio Schema for Minimization [Maximization] problems: • Algorithm r-ApproxMin[Max]( Set, p ) • If Set = ∅ then return ∅; • If ∃ I ∈ Set with p(I) = 0 then return {I} ∪ r-ApproxMin( Set − {I}, p ); • [If ∃ I ∈ Set with p(I) ≤ 0 then return r-ApproxMax( Set − {I}, p );] • Define "good" p1; • REC = r-ApproxMin[Max]( Set, p − p1 ); • If REC is not an r-approximation w.r.t. p1 then "fix it"; • return REC;
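The minimization schema translates almost line for line into code. Below is a rough Python skeleton (my own illustration, not from the slides); `choose_p1` and `fix_up` are problem-specific callbacks, and in a real instantiation they would also consult the problem structure (e.g. the graph), typically captured in a closure.

```python
def lr_approx_min(elements, p, choose_p1, fix_up):
    """Generic local-ratio schema for minimization (schematic sketch).

    elements  -- items that may be bought (e.g. vertices)
    p         -- dict: element -> current non-negative weight
    choose_p1 -- returns a 'good' (r-effective) weight function as a dict
    fix_up    -- repairs the recursive solution if it is not an
                 r-approximation w.r.t. p1 (a no-op for many problems)
    """
    if not elements:
        return set()
    # A zero-weight element can be taken for free.  For a concrete problem
    # one would also shrink the instance here (e.g. delete covered edges).
    for e in elements:
        if p[e] == 0:
            rest = [x for x in elements if x != e]
            return {e} | lr_approx_min(rest, p, choose_p1, fix_up)
    # Subtract a 'good' p1 and solve the residual problem recursively.
    p1 = choose_p1(elements, p)
    residual = {e: p[e] - p1.get(e, 0) for e in elements}
    rec = lr_approx_min(elements, residual, choose_p1, fix_up)
    return fix_up(rec, p1)
```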
The Local-Ratio Theorem: Applications
Applications to some optimization algorithms (r = 1):
(MST) Minimum Spanning Tree (Kruskal)
(SHORTEST-PATH) s-t Shortest Path (Dijkstra)
(LONGEST-PATH) s-t DAG Longest Path (Can be done with dynamic programming)
(INTERVAL-IS) Independent Set in Interval Graphs (Usually done with dynamic programming)
(LONG-SEQ) Longest (weighted) monotone subsequence (Can be done with dynamic programming)
(MIN_CUT) Minimum Capacity s,t Cut (e.g. Ford, Dinitz)
Applications to some 2-approximation algorithms (r = 2):
(VC) Minimum Vertex Cover (Bar-Yehuda and Even)
(FVS) Vertex Feedback Set (Becker and Geiger)
(GSF) Generalized Steiner Forest (Williamson, Goemans, Mihail, and Vazirani)
(Min 2SAT) Minimum Two-Satisfiability (Gusfield and Pitt)
(2VIP) Two Variable Integer Programming (Bar-Yehuda and Rawitz)
(PVC) Partial Vertex Cover (Bar-Yehuda)
(GVC) Generalized Vertex Cover (Bar-Yehuda and Rawitz)
Applications to some other approximations:
(SC) Minimum Set Cover (Bar-Yehuda and Even)
(PSC) Partial Set Cover (Bar-Yehuda)
(MSP) Maximum Set Packing (Arkin and Hassin)
Applications to Resource Allocation and Scheduling: ….
The creative part… find r-effective weights • p1 is r-effective if every feasible solution x is an r-approximation w.r.t. p1 • i.e. p1·x ≤ r·p1* • VC (vertex cover): • Edge • Matching • Greedy • Homogeneous
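For the "Edge" weight function, 2-effectiveness is the two-line argument spelled out below (in the slides' notation): put ε = min{p(u), p(v)} on both endpoints of a single edge {u,v} and 0 elsewhere.

```latex
\[
  p_1(w)=\begin{cases}\varepsilon & w\in\{u,v\}\\ 0 & \text{otherwise}\end{cases}
  \qquad\Longrightarrow\qquad
  p_1^{*}\ \ge\ \varepsilon
  \quad\text{and}\quad
  p_1\cdot x\ \le\ 2\varepsilon\ \le\ 2\,p_1^{*}
  \ \ \text{for every vertex cover } x,
\]
% since every cover must contain u or v, and can contain at most both of them.
```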
VC: Recursive implementation (edge by edge) • VC (V, E, p) • If E = ∅ return ∅; • If ∃v with p(v) = 0 return {v} ∪ VC(V − {v}, E − E(v), p); • Let (x,y) ∈ E; • Let ε = min{p(x), p(y)}; • Define p1(v) = ε if v = x or v = y, and 0 otherwise; • Return VC(V, E, p − p1)
VC: Iterative implementation (edge by edge) • VC (V, E, p) • for each e ∈ E: • let ε = min{p(v) | v ∈ e}; • for each v ∈ e: • p(v) = p(v) − ε; • return {v | p(v) = 0};
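A direct Python transcription of this iterative version (the function name and the toy instance are my own, added for illustration):

```python
def vc_2_approx(edges, p):
    """Edge-by-edge local-ratio 2-approximation for weighted vertex cover.

    edges -- iterable of 2-tuples (u, v)
    p     -- dict: vertex -> non-negative penalty (a copy is modified)
    """
    p = dict(p)
    for u, v in edges:
        eps = min(p[u], p[v])   # epsilon of the local-ratio step
        p[u] -= eps             # subtract the 2-effective edge weights
        p[v] -= eps
    # After processing an edge, at least one endpoint has weight 0,
    # so the zero-weight vertices form a cover.
    return {v for v, w in p.items() if w == 0}

# Toy usage: a path a-b-c with penalties 3, 1, 2.
cover = vc_2_approx([("a", "b"), ("b", "c")], {"a": 3, "b": 1, "c": 2})
print(cover)   # {'b'} -- covers both edges; weight 1 <= 2 * OPT
```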
[Example figure: grocery items labelled with their prices] • Min 5xBisli + 8xTea + 12xWater + 10xBamba + 20xShampoo + 15xPopcorn + 6xChocolate • s.t. xShampoo + xWater ≥ 1
VC: Iterative implementation (edge by edge) • VC (V, E, p) • for each e ∈ E: • let ε = min{p(v) | v ∈ e}; • for each v ∈ e: • p(v) = p(v) − ε; • return {v | p(v) = 0}; • [Figure: example run on a graph with vertex weights 30, 15, 90, 10, 50, 100, 80, 2]
VC: Greedy (O(H(Δ))-approximation), where H(Δ) = 1 + 1/2 + 1/3 + … + 1/Δ = O(ln Δ) • Greedy_VC (V, E, p) • C = ∅; • while E ≠ ∅ • let v = arg min p(v)/d(v) • C = C + {v}; • V = V − {v}; E = E − E(v); • return C; • [Figure: tight example for the greedy bound, with vertex weights n/Δ, …, n/4, n/3, n/2, n]
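A straightforward (deliberately unoptimized) Python rendering of the greedy rule; names and the edge representation are my own, and degrees are recomputed against the still-uncovered edges each round.

```python
def greedy_vc(vertices, edges, p):
    """Greedy weighted vertex cover: repeatedly take the vertex minimizing
    price-per-covered-edge p(v)/d(v)."""
    uncovered = {frozenset(e) for e in edges}
    cover = set()
    while uncovered:
        # degree of each vertex with respect to the still-uncovered edges
        deg = {v: sum(1 for e in uncovered if v in e) for v in vertices}
        v = min((u for u in vertices if deg[u] > 0),
                key=lambda u: p[u] / deg[u])
        cover.add(v)
        uncovered = {e for e in uncovered if v not in e}
    return cover
```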
VC: LR-Greedy (star by star) • LR_Greedy_VC (V, E, p) • C = ∅; • while E ≠ ∅ • let v = arg min p(v)/d(v) • let ε = p(v)/d(v); • C = C + {v}; • V = V − {v}; E = E − E(v); • for each u ∈ N(v) • p(u) = p(u) − ε; • return C;
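A Python sketch of the star-by-star variant (my own transcription, assuming a simple loop-free graph): each round picks the vertex v minimizing p(v)/d(v), takes it, removes its star, and subtracts ε = p(v)/d(v) from every remaining neighbor.

```python
def lr_greedy_vc(vertices, edges, p):
    """Local-ratio 'star by star' weighted vertex cover (sketch)."""
    p = {v: float(p[v]) for v in vertices}
    remaining = {frozenset(e) for e in edges}
    cover = set()
    while remaining:
        deg = {v: sum(1 for e in remaining if v in e) for v in p}
        v = min((u for u in p if deg[u] > 0), key=lambda u: p[u] / deg[u])
        eps = p[v] / deg[v]
        cover.add(v)
        for e in list(remaining):
            if v in e:
                (u,) = e - {v}      # the leaf of this star edge
                p[u] -= eps         # subtract the star weight from the leaf
                remaining.discard(e)
        del p[v]                    # v and its star are removed from the graph
    return cover
```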
VC: LR-Greedy by reducing 2-effective homogeneous weights • Homogeneous = all vertices have the same "greedy value" p(v)/d(v) • LR_Greedy_VC (V, E, p) • C = ∅; • Repeat • Let ε = min{p(v)/d(v) : v ∈ V}; • For each v ∈ V • p(v) = p(v) − ε·d(v); • Move from V to C all zero-weight vertices; • Remove from V all zero-degree vertices; • Until E = ∅ • Return C; • [Figure: example graph with vertex weights 3, 4, 6, 4, 3, 5, 3, 2]
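A Python sketch of the homogeneous-reduction variant (again my own transcription, with a small floating-point tolerance for the zero test): each round subtracts ε·d(v) from every vertex, where ε is the smallest greedy value p(v)/d(v), then harvests the vertices whose weight dropped to zero.

```python
def lr_greedy_vc_homogeneous(vertices, edges, p):
    """Local-ratio vertex cover via homogeneous weight reduction (sketch)."""
    p = {v: float(p[v]) for v in vertices}
    remaining = {frozenset(e) for e in edges}
    cover = set()
    while remaining:
        deg = {v: sum(1 for e in remaining if v in e) for v in p}
        active = [v for v in p if deg[v] > 0]
        eps = min(p[v] / deg[v] for v in active)
        for v in active:
            p[v] -= eps * deg[v]                    # homogeneous reduction
        zero = {v for v in active if p[v] <= 1e-12} # move zero-weight vertices to C
        cover |= zero
        remaining = {e for e in remaining if not (e & zero)}
        for v in zero:
            del p[v]
        # zero-degree vertices simply stop being 'active' next round
    return cover
```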
Example MST (Minimum Spanning Tree) • Given a graph G(V,E) and a length vector p ∈ Z^|E| • Minimize p·x • Subject to: x ∈ {0,1}^|E| • ∑e∈P xe ≥ 1 ∀ cut P
MST: Recursive implementation (Homogeneous) • MST (V, E, p) • If V = ∅ return ∅; • If ∃ self-loop e return MST(V, E − {e}, p); • If ∃e with p(e) = 0 return {e} ∪ MST(shrink(V, e), shrink(E, e), p); • Let ε = min{p(e) : e ∈ E}; • Define p1(e) = ε for all e ∈ E; • Return MST(V, E, p − p1)
MST: Iterative implementation (Homogeneous) • MST (V, E, p) • Kruskal
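The slide leaves the iterative version at "Kruskal". For completeness, here is a standard Kruskal implementation with union-find (a textbook version, not taken from the slides); in the homogeneous local-ratio view, scanning edges by weight and contracting the cheapest non-self-loop edge plays the role of subtracting ε from all edges and shrinking zero-weight edges. Edges are assumed to be given as (weight, u, v) triples.

```python
def kruskal_mst(vertices, edges):
    """Kruskal's algorithm: scan edges by non-decreasing weight and keep an
    edge iff it connects two different components (union-find)."""
    parent = {v: v for v in vertices}

    def find(v):                       # find with path halving
        while parent[v] != v:
            parent[v] = parent[parent[v]]
            v = parent[v]
        return v

    tree = []
    for w, u, v in sorted(edges):      # non-decreasing weight order
        ru, rv = find(u), find(v)
        if ru != rv:                   # skip self-loops of the contracted graph
            parent[ru] = rv            # contract: merge the two components
            tree.append((w, u, v))
    return tree

# Usage: kruskal_mst({'a','b','c'}, [(1,'a','b'), (2,'b','c'), (3,'a','c')])
```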