
Approximation Algorithms


Presentation Transcript


  1. Approximation Algorithms: Greedy Strategies

  2. Max and Min • min f is equivalent to max (−f). However, a good approximation for min f may not be a good approximation for max (−f). • For example, consider a graph G=(V,E). C is a minimum vertex cover of G iff V \ C is a maximum independent set of G. Minimum vertex cover has a polynomial-time 2-approximation, but maximum independent set has no constant-factor approximation unless P=NP. • Another example: Minimum Connected Dominating Set and Maximum-Leaf Spanning Tree (a spanning tree with the maximum number of leaves) are complementary in the same way.
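As a quick sanity check of the complement relation, here is a minimal sketch on a 4-cycle; the helper names are illustrative, not from the slides.

```python
def is_vertex_cover(cover, edges):
    # every edge needs at least one endpoint in the cover
    return all(u in cover or v in cover for u, v in edges)

def is_independent_set(ind, edges):
    # no edge may have both endpoints inside the set
    return all(not (u in ind and v in ind) for u, v in edges)

V = {1, 2, 3, 4}
E = [(1, 2), (2, 3), (3, 4), (4, 1)]   # a 4-cycle

C = {1, 3}                             # a minimum vertex cover
assert is_vertex_cover(C, E)
assert is_independent_set(V - C, E)    # V \ C = {2, 4} is independent
```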

  3. Greedy for Max and Min • Max --- independent system • Min --- submodular potential function

  4. Independent System

  5. Independent System • Consider a set E and a collection C of subsets of E. (E, C) is called an independent system if C is hereditary: A ∈ C and B ⊆ A imply B ∈ C. • The elements of C are called independent sets.

  6. Maximization Problem • Given an independent system (E, C) and a nonnegative cost function c on E, find an independent set A ∈ C that maximizes c(A) = Σe∈A c(e).

  7. Greedy Approximation MAX • Sort the elements of E so that c(e1) ≥ c(e2) ≥ … ≥ c(em). • Start with A = ∅; for each ei in this order, add ei to A if A ∪ {ei} is still independent. • Output A.
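A minimal sketch of this greedy scheme, assuming independence is supplied as a predicate; the names are illustrative, not from the slides.

```python
def greedy_max(elements, cost, is_independent):
    solution = set()
    # scan elements in nonincreasing order of cost
    for e in sorted(elements, key=cost, reverse=True):
        if is_independent(solution | {e}):
            solution.add(e)
    return solution

# Example: a uniform matroid where any subset of size <= 2 is independent.
costs = {"a": 5, "b": 3, "c": 9, "d": 1}
print(greedy_max(costs, costs.get, lambda s: len(s) <= 2))  # {'c', 'a'}
```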

  8. Theorem • For any independent system (E, C) and nonnegative cost function c, Greedy MAX returns an independent set G with c(OPT)/c(G) ≤ max over F ⊆ E of v(F)/u(F), where u(F) and v(F) are the minimum and maximum sizes of a maximal independent subset of F.
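The slide's own formula was not captured; assuming it states the classical Jenkyns / Korte–Hausmann bound that the following slides instantiate, it reads:

```latex
% Greedy MAX bound for an independent system (E, C);
% u(F), v(F) = minimum / maximum size of a maximal independent subset of F.
\frac{c(\mathrm{OPT})}{c(\mathrm{Greedy})}
  \;\le\; \max_{F \subseteq E} \frac{v(F)}{u(F)}
```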

  9.–11. Proof (equations not captured in the transcript)

  12. Maximum Weight Hamiltonian Cycle • Given an edge-weighted complete graph, find a Hamiltonian cycle with maximum total weight.

  13. Independent sets • E = {all edges} • A subset of edges is independent if it is a Hamiltonian cycle or a vertex-disjoint union of paths. • C = the collection of all such subsets
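A sketch of the independence test for this system (vertices assumed labeled 0..n−1; names are illustrative): every vertex must keep degree at most 2, and the only cycle allowed is one through all n vertices.

```python
from collections import defaultdict

def is_independent(edge_subset, n):
    deg = defaultdict(int)
    parent = list(range(n))          # union-find to detect cycles

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x

    cycle_edges = 0
    for u, v in edge_subset:
        deg[u] += 1
        deg[v] += 1
        if deg[u] > 2 or deg[v] > 2:
            return False             # a vertex on three edges
        ru, rv = find(u), find(v)
        if ru == rv:
            cycle_edges += 1         # this edge closes a cycle
        else:
            parent[ru] = rv
    if cycle_edges == 0:
        return True                  # vertex-disjoint union of paths
    # at most one cycle, and it must use all n vertices (Hamiltonian)
    return cycle_edges == 1 and len(edge_subset) == n
```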

  14. Maximal Independent Sets • Consider a subset F of edges. For any two maximal independent sets I and J of F, |J| < 2|I|

  15. (proof not captured in the transcript)

  16. Theorem: For the maximum weight Hamiltonian cycle problem, the greedy algorithm MAX is a polynomial-time approximation with performance ratio at most 2.

  17. Maximum Weight Directed Hamiltonian Cycle • Given an edge-weighted complete digraph, find a Hamiltonian cycle with maximum total weight.

  18. Independent sets • E = {all edges} • A subset of edges is independent if it is a directed Hamiltonian cycle or a vertex-disjoint union of directed paths.

  19. (content not captured in the transcript)

  20. Tightness • (figure) Edges of weight 1, 1, 1, and 1+ε; all remaining edges have weight ε.

  21. A Special Case • If c satisfies the following quadrilateral condition: for any 4 vertices u, v, u′, v′ in V, [inequality not captured in the transcript], then the greedy approximation for maximum weight Hamiltonian cycle has performance ratio 2.

  22.–25. (proof slides; content not captured in the transcript)

  26. Superstring • Given n strings s1, s2, …, sn, find a shortest string s containing every si as a substring. • Assume no si is a substring of another sj.

  27. An Example • Given S = {abcc, efaab, bccef} • Some possible solutions: • Concatenating all strings: abccefaabbccef (14 chars) • A shortest superstring: abccefaab (9 chars)

  28. Relationship to Set Cover? • How to “transform” the shortest superstring (SS) problem into the Set Cover (SC) problem? • Need to identify the universe U • Need to identify the collection of sets • Need to define the cost function • Turning the SS instance into an SC instance: • Let U = S (the set of n input strings). • How to define the collection of sets?

  29. Relationship to SC (cont) • For each pair si, sj and each k > 0 such that the last k characters of si equal the first k characters of sj, let σijk be the string obtained by overlapping si and sj by those k characters: si followed by the remainder of sj. • Let M be the set that consists of all such strings σijk.

  30. Relationship to SC (cont) • Now, for each σ ∈ M ∪ S, define set(σ) = {s ∈ S : s is a substring of σ}, and define the cost of set(σ) to be |σ|. • If C is a set cover of this constructed SC instance, then the concatenation of all strings σ with set(σ) ∈ C is a solution of SS. • Note that C is a collection of sets of the form set(σ).

  31. Algorithm 1 for SS • Construct the SC instance above, run the greedy set cover algorithm on it, and output the concatenation of the strings σ selected by the cover.
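A sketch of Algorithm 1 under the construction above; function names are illustrative, and the k = 0 case is handled by putting each si itself into M.

```python
def build_M(S):
    M = set(S)                       # each s_i alone is also a candidate
    for a in S:
        for b in S:
            if a == b:
                continue
            for k in range(1, min(len(a), len(b))):
                if a[-k:] == b[:k]:
                    M.add(a + b[k:])  # sigma_ijk: a overlapped with b by k
    return M

def greedy_set_cover_superstring(S):
    M = build_M(S)
    uncovered, chosen = set(S), []
    while uncovered:
        # greedy set cover: pick the sigma with the best cost per
        # newly covered string
        candidates = [sig for sig in M if any(s in sig for s in uncovered)]
        best = min(candidates,
                   key=lambda sig: len(sig) / sum(s in sig for s in uncovered))
        chosen.append(best)
        uncovered -= {s for s in uncovered if s in best}
    return "".join(chosen)

print(greedy_set_cover_superstring({"abcc", "efaab", "bccef"}))
```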

  32. Approximation Ratio • Lemma 1: Let opt be the length of the optimal solution of SS and opt′ be the cost of the optimal solution of SC; then opt ≤ opt′ ≤ 2·opt. • Proof sketch: opt ≤ opt′ because concatenating the strings chosen by any set cover yields a superstring whose length is that cover's total cost. For opt′ ≤ 2·opt, order the strings by their starting positions in an optimal superstring and group consecutive strings so that each group is covered by a single σijk; any position of the optimal superstring lies in at most two of these σ's, so their total cost is at most 2·opt.

  33. Proof of Lemma 1 (cont) (details not captured in the transcript)

  34. Approximation Ratio • Theorem 1: Algorithm 1 has an approximation ratio of 2Hn, where Hn = 1 + 1/2 + … + 1/n. • Proof: The greedy set cover algorithm has approximation ratio Hn. From Lemma 1, it follows directly that Algorithm 1 is a 2Hn-factor algorithm for SS.

  35. Prefix and Overlap • For two strings s1 and s2: • overlap(s1, s2) = the longest string that is both a suffix of s1 and a prefix of s2. • pref(s1, s2) = the prefix of s1 that remains after chopping off overlap(s1, s2). • Example: s1 = abcbcaa and s2 = bcaaca; then overlap(s1, s2) = bcaa and pref(s1, s2) = abc. • Note: overlap(s1, s2) ≠ overlap(s2, s1) in general.
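These two definitions translate directly into code; a minimal sketch with illustrative helper names, checked against the slide's example.

```python
def overlap(s1, s2):
    # longest string that is both a suffix of s1 and a prefix of s2
    for k in range(min(len(s1), len(s2)), 0, -1):
        if s1[-k:] == s2[:k]:
            return s1[-k:]
    return ""

def pref(s1, s2):
    # what remains of s1 after chopping off overlap(s1, s2)
    return s1[:len(s1) - len(overlap(s1, s2))]

assert overlap("abcbcaa", "bcaaca") == "bcaa"
assert pref("abcbcaa", "bcaaca") == "abc"
assert overlap("bcaaca", "abcbcaa") == "a"   # not symmetric
```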

  36. Is there any better approach? • Suppose that in the optimal solution the strings appear from left to right in the order s1, s2, …, sn. • Then: opt = |pref(s1, s2)| + … + |pref(sn−1, sn)| + |pref(sn, s1)| + |overlap(sn, s1)| • Why the overlap(sn, s1) term? Consider S = {agagag, gagaga}. Using prefixes only, the total would be pref(s1, s2) · pref(s2, s1) = ag (length 2), whereas the shortest superstring is agagaga (length 7).

  37. Prefix Graph • Define the prefix graph as follows: • A complete weighted directed graph G=(V,E) • V is a set of n vertices, labeled 1 to n (vertex i represents string si) • For each edge i→j, i ≠ j, assign weight |pref(si, sj)| • Example: S = {abc, bcd, dab}; here pref(abc, bcd) = a (weight 1), pref(bcd, dab) = bc (weight 2), pref(dab, abc) = d (weight 1).

  38. Cycle Cover • Cycle Cover: a collection of disjoint cycles covering all vertices (each vertex is in exactly one cycle) • Note that the tour 1 → 2 → … → n → 1 is a cycle cover • Minimum weight cycle cover: sum of weights is minimum over all covers • Thus, we want to find a minimum weight cycle cover

  39. How to find a min. weight cycle cover • Corresponding to the prefix graph, construct a bipartite graph H=(X, Y; E) such that: • X = {x1, x2, …, xn} and Y = {y1, y2, …, yn} • For each i, j ∈ {1, …, n}, add edge (xi, yj) of weight |pref(si, sj)| • Each cycle cover of the prefix graph ↔ a perfect matching of the same weight in H (a perfect matching is a matching that covers all vertices). • Finding a minimum weight cycle cover = finding a minimum weight perfect matching, which can be done in polynomial time.
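A sketch of this reduction using scipy's Hungarian-method solver (scipy.optimize.linear_sum_assignment) for the minimum weight perfect matching; as a simplification, self-loops are excluded with a large weight.

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

def pref_len(a, b):
    # |pref(a, b)| = |a| - |overlap(a, b)|
    k = max((k for k in range(min(len(a), len(b)), 0, -1)
             if a[-k:] == b[:k]), default=0)
    return len(a) - k

def min_cycle_cover(S):
    n = len(S)
    W = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            # self-loops excluded with a large weight (a simplification)
            W[i, j] = pref_len(S[i], S[j]) if i != j else 10**9
    rows, cols = linear_sum_assignment(W)   # min weight perfect matching
    return {int(i): int(j) for i, j in zip(rows, cols)}  # successor map
```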

  40. How to break the cycle • (figure) A cycle through strings s11, s12, s13 is opened at one of them, which becomes the cycle's representative.

  41. A constant factor algorithm • Algorithm 2: (1) Construct the prefix graph and find a minimum weight cycle cover C = {c1, …, ck}. (2) Open each cycle ci at an arbitrary string, its representative ri, producing a string σ(ci) that contains every string on the cycle. (3) Output the concatenation σ(c1)σ(c2)…σ(ck).
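Continuing the sketch, a possible rendering of Algorithm 2 on top of min_cycle_cover and the pref helper from the earlier sketches (names illustrative):

```python
def open_cycle(S, succ, start):
    # concatenate pref(.,.) around the cycle, then append the start string,
    # so the result contains every string on the cycle
    out, i = "", start
    while True:
        j = succ[i]
        out += pref(S[i], S[j])
        i = j
        if i == start:
            break
    return out + S[start]

def algorithm2(S):
    succ = min_cycle_cover(S)
    seen, result = set(), ""
    for i in range(len(S)):
        if i in seen:
            continue
        j = i
        while j not in seen:          # mark the whole cycle through i
            seen.add(j)
            j = succ[j]
        result += open_cycle(S, succ, i)
    return result
```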

  42. Approximation Ratio • Lemma 2: Let C be the minimum weight cycle cover of S. Let c and c’ be two cycles in C, and let r, r’ be representative strings from these cycles. Then |overlap(r, r’)| < w(c) + w(c’) • Proof: Exercise

  43. Approximation Ratio (cont) • Theorem 2: Algorithm 2 has an approximation ratio of 4. • Proof: (see next slide)

  44. Proof (details not captured in the transcript)

  45. Modification to 3-Approximation • (The modification itself was not captured in the transcript.) Instead of simply concatenating the opened cycles σ(ci), combine them while taking the overlaps between the representatives into account; by Lemma 2 these overlaps are bounded, and a more careful accounting improves the ratio from 4 to 3.

  46. 3-Approximation Algorithm • Algorithm 3: (steps not captured in the transcript)

  47. Superstring via Hamiltonian path • |ov(u, v)| = max{ |w| : there exist x and y such that u = xw and v = wy } • The overlap graph G is a complete digraph with V = {s1, s2, …, sn} and edge weight |ov(u, v)|. • Suppose s* is the shortest superstring, and let s1, …, sn be the strings in order of appearance from left to right. Then consecutive strings si, si+1 realize their maximum overlap in s*. Hence s1, …, sn form a directed Hamiltonian path in G.

  48.–49. (content not captured in the transcript)

  50. The Algorithm (via Hamiltonian) • Apply Greedy MAX to the overlap graph to obtain a heavy directed Hamiltonian path, then merge the strings along the path. The resulting superstring has length (sum of all |si|) − (total overlap weight of the path), so maximizing the path's overlap weight minimizes the superstring's length.
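A common greedy realization of this idea (a sketch, not necessarily the slide's exact algorithm): repeatedly merge the pair of strings with the largest overlap, which amounts to greedily building a heavy Hamiltonian path in the overlap graph. It assumes the overlap helper from the earlier sketch.

```python
def greedy_superstring(strings):
    strs = list(strings)
    while len(strs) > 1:
        best = None                       # (overlap_len, i, j)
        for i, a in enumerate(strs):
            for j, b in enumerate(strs):
                if i != j:
                    k = len(overlap(a, b))
                    if best is None or k > best[0]:
                        best = (k, i, j)
        k, i, j = best
        merged = strs[i] + strs[j][k:]    # merge the best-overlapping pair
        strs = [s for t, s in enumerate(strs) if t not in (i, j)] + [merged]
    return strs[0]

print(greedy_superstring(["abcc", "efaab", "bccef"]))
# prints "efaabccef" (length 9, optimal for this instance)
```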
