
Optimization problems


Presentation Transcript


  1. Optimization problems INSTANCE FEASIBLE SOLUTIONS COST

  2. Vertex Cover problem INSTANCE graph G FEASIBLE SOLUTIONS S ⊆ V, such that (∀e ∈ E) S ∩ e ≠ ∅ COST c(S) = |S|

  3. Set Cover problem INSTANCE family of sets A1,...,An ⊆ Ω FEASIBLE SOLUTIONS S ⊆ [n], such that ∪i∈S Ai = Ω COST c(S) = |S|

  4. Vertex Cover problem / Set Cover problem
     Set Cover: INSTANCE family of sets A1,...,An ⊆ Ω; FEASIBLE SOLUTIONS S ⊆ [n], such that ∪i∈S Ai = Ω; COST c(S) = |S|
     Vertex Cover: INSTANCE graph G; FEASIBLE SOLUTIONS S ⊆ V, such that (∀e ∈ E) S ∩ e ≠ ∅; COST c(S) = |S|

  5. Vertex Cover problem as a Set Cover problem
     Set Cover: INSTANCE family of sets A1,...,An ⊆ Ω; FEASIBLE SOLUTIONS S ⊆ [n], such that ∪i∈S Ai = Ω; COST c(S) = |S|
     Vertex Cover: INSTANCE graph G; FEASIBLE SOLUTIONS S ⊆ V, such that (∀e ∈ E) S ∩ e ≠ ∅; COST c(S) = |S|
     Take Ω = E and Ai = the set of edges adjacent to vertex i ∈ V.
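The correspondence on this slide is easy to spell out in code. The following Python sketch (function and variable names are ours, not from the slides) builds the Set Cover instance with Ω = E and Ai = the edges adjacent to i:

```python
# Vertex Cover viewed as Set Cover: universe = edge set E,
# and for each vertex v the set A_v = edges incident to v.

def vertex_cover_as_set_cover(vertices, edges):
    """Return (universe, family): family[v] is the set of edges touching v."""
    universe = {frozenset(e) for e in edges}
    family = {v: {frozenset(e) for e in edges if v in e} for v in vertices}
    return universe, family

# Tiny path graph 1-2-3: vertex 2 alone touches both edges, so
# S = {2} is simultaneously a set cover and a vertex cover.
universe, family = vertex_cover_as_set_cover(
    vertices=[1, 2, 3], edges=[(1, 2), (2, 3)]
)
```

A set cover S ⊆ [n] then corresponds exactly to a vertex cover, with the same cost c(S) = |S|.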

  6. Optimization problems INSTANCE FEASIBLE SOLUTIONS COST OPTIMAL SOLUTION OPT = min { c(T) : T ∈ FEASIBLE SOLUTIONS }

  7. α-approximation algorithm: given an INSTANCE, outputs T such that c(T) ≤ α · OPT

  8. Last Class: 2-approximation algorithm for Vertex-Cover 2-approximation algorithm for Metric TSP 1.5-approximation algorithm for Metric TSP

  9. This Class: (1+ε)-approximation algorithm for Knapsack O(log n)-approximation algorithm for Set Cover

  10. Knapsack INSTANCE: value vi, weight wi, for i ∈ {1,...,n} weight limit W FEASIBLE SOLUTION: collection of items S ⊆ {1,...,n} with total weight ≤ W COST (MAXIMIZE): sum of the values of items in S

  11. Knapsack INSTANCE: value vi, weight wi, for i ∈ {1,...,n} weight limit W FEASIBLE SOLUTION: collection of items S ⊆ {1,...,n} with total weight ≤ W COST (MAXIMIZE): sum of the values of items in S We had: pseudo-polynomial algorithm, time = O(Wn); pseudo-polynomial algorithm, time = O(Vn), where V = v1 + ... + vn
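The second pseudo-polynomial algorithm mentioned here, with time O(Vn), keeps a table indexed by total value. A minimal Python sketch (names are ours, not from the slides):

```python
# O(Vn) dynamic program for Knapsack, where V = v1 + ... + vn:
# best[t] = minimum weight of a subset with total value exactly t.

def knapsack_by_value(values, weights, W):
    V = sum(values)
    INF = float("inf")
    best = [0] + [INF] * V          # best[0] = 0: the empty set
    for v, w in zip(values, weights):
        for t in range(V, v - 1, -1):  # downward: each item used at most once
            if best[t - v] + w < best[t]:
                best[t] = best[t - v] + w
    # answer: largest value achievable within the weight limit
    return max(t for t in range(V + 1) if best[t] <= W)
```

For example, with items (value, weight) = (60, 10), (100, 20), (120, 30) and W = 50, the optimum is 220 (take the last two items).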

  12. Knapsack INSTANCE: value vi, weight wi, for i ∈ {1,...,n} weight limit W FEASIBLE SOLUTION: collection of items S ⊆ {1,...,n} with total weight ≤ W COST (MAXIMIZE): sum of the values of items in S GOAL: convert the pseudo-polynomial algorithm with time = O(Vn), where V = v1 + ... + vn, into an approximation algorithm IDEA = rounding

  13. Knapsack
     wlog all wi ≤ W; M = maximum of the vi
     round vi ↦ vi′ := ⌊n·vi / (εM)⌋, so that OPT′ ≤ n²/ε
     S = optimal solution in the original instance, S′ = optimal solution in the modified instance
     Will show: the optimal solution in the modified instance is an approximately optimal solution in the original one

  14. Knapsack
     vi ↦ vi′ := ⌊n·vi / (εM)⌋
     S = optimal solution in the original instance, S′ = optimal solution in the modified instance
     (n/(εM)) Σi∈S′ vi ≥ Σi∈S′ vi′ ≥ Σi∈S vi′ ≥ Σi∈S (n·vi/(εM) − 1)
     Will show: the optimal solution in the modified instance is an approximately optimal solution in the original one

  15. Knapsack
     vi ↦ vi′ := ⌊n·vi / (εM)⌋
     S = optimal solution in the original instance, S′ = optimal solution in the modified instance
     (n/(εM)) Σi∈S′ vi ≥ Σi∈S′ vi′ ≥ Σi∈S vi′ ≥ Σi∈S (n·vi/(εM) − 1)
     Dividing by n/(εM): Σi∈S′ vi ≥ Σi∈S (vi − εM/n) ≥ OPT − εM ≥ OPT(1 − ε)
     (the last step uses M ≤ OPT: since all wi ≤ W, the single most valuable item is a feasible solution)

  16. Running time? pseudo-polynomial algorithm, time = O(V′n), where V′ = v′1 + ... + v′n M = maximum of the vi vi ↦ vi′ := ⌊n·vi / (εM)⌋

  17. Running time? pseudo-polynomial algorithm, time = O(V′n), where V′ = v′1 + ... + v′n M = maximum of the vi vi ↦ vi′ := ⌊n·vi / (εM)⌋ Each v′i ≤ n/ε, so V′ ≤ n²/ε and the running time is O(n³/ε)

  18. We have an algorithm for the Knapsack problem which outputs a solution with value ≥ (1−ε)·OPT and runs in time O(n³/ε). FPTAS (fully polynomial-time approximation scheme): a (1+ε)-approximation algorithm running in time poly(|INPUT|, 1/ε)
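Putting the rounding step and the value-based DP together gives a sketch of the whole scheme. This Python version is illustrative (names are ours); it assumes all wi ≤ W, as on slide 13:

```python
import math

def knapsack_fptas(values, weights, W, eps):
    """Sketch of the FPTAS: round values down to v' = floor(n*v / (eps*M)),
    then solve the rounded instance exactly with the O(V'n) value-based DP.
    Assumes all weights are <= W. Returns the chosen item indices."""
    n, M = len(values), max(values)
    scale = eps * M / n
    vprime = [math.floor(v / scale) for v in values]

    Vp = sum(vprime)
    INF = float("inf")
    best = [0] + [INF] * Vp               # best[t] = min weight for rounded value t
    choice = [frozenset()] + [None] * Vp  # an item set achieving best[t]
    for i, (v, w) in enumerate(zip(vprime, weights)):
        for t in range(Vp, v - 1, -1):    # downward: each item used at most once
            if best[t - v] + w < best[t]:
                best[t] = best[t - v] + w
                choice[t] = choice[t - v] | {i}
    t_star = max(t for t in range(Vp + 1) if best[t] <= W)
    return choice[t_star]

# On this tiny instance, eps = 0.5 already recovers the optimum (value 220).
S = knapsack_fptas([60, 100, 120], [10, 20, 30], W=50, eps=0.5)
```

The rounded table has at most n²/ε entries, matching the O(n³/ε) bound on slide 17.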

  19. Weighted set cover problem INSTANCE: A1,...,Am, weights w1,...,wm FEASIBLE SOLUTION: collection S of the Ai covering Ω OBJECTIVE (minimize): the cost of the collection (in the unweighted version we have wi = 1)

  20. Weighted set cover problem Greedy algorithm: pick Ai with minimal wi / |Ai|; remove the elements of Ai from Ω; repeat

  21. Negative example (last class): approximation ratio = Ω(log n)

  22. Weighted set cover problem Greedy algorithm: pick Ai with minimal wi / |Ai|; remove the elements of Ai from Ω; repeat Theorem: the greedy algorithm is an O(log n)-approximation algorithm.

  23. Weighted set cover problem Theorem: the greedy algorithm is an O(log n)-approximation algorithm. Greedy algorithm: pick Ai with minimal wi / |Ai|; remove the elements of Ai from Ω; repeat Charging: when Ai is picked, the cost of the solution increases by wi; each newly covered element of Ai pays wi / |Ai|

  24. Weighted set cover problem Let B be a set of weight w. How much did the elements of B pay? (B offers: "pick me! cost = w/|B|")

  25. Weighted set cover problem Theorem: the greedy algorithm is an O(log n)-approximation algorithm. Greedy algorithm: pick Ai with minimal wi / |Ai|; remove the elements of Ai from Ω; repeat If greedy picks some Ai instead ("sorry, Ai was cheaper"), the elements of B covered at this step paid less than w/|B|

  26. Weighted set cover problem Theorem: the greedy algorithm is an O(log n)-approximation algorithm. Greedy algorithm: pick Ai with minimal wi / |Ai|; remove the elements of Ai from Ω; repeat

  27. Weighted set cover problem Continue: the uncovered part of B went down by 1 (B now offers: "pick me! cost = w/(|B|−1)")

  28. Weighted set cover problem Theorem: the greedy algorithm is an O(log n)-approximation algorithm. Greedy algorithm: pick Ai with minimal wi / |Ai|; remove the elements of Ai from Ω; repeat If greedy picks some Aj instead ("sorry, Aj was cheaper"), the elements of B covered at this step paid less than w/(|B|−1)

  29. Weighted set cover problem Theorem: the greedy algorithm is an O(log n)-approximation algorithm. Greedy algorithm: pick Ai with minimal wi / |Ai|; remove the elements of Ai from Ω; repeat

  30. Weighted set cover problem Continue: the uncovered part of B went down by 1 again (B now offers: "pick me! cost = w/(|B|−2)")

  31. Weighted set cover problem Listing the elements of B in the order they are covered by greedy, they paid ≤ w/|B|, ≤ w/(|B|−1), ≤ w/(|B|−2), ..., ≤ w/2, ≤ w

  32. Weighted set cover problem The elements of B, in the order they are covered by greedy, paid ≤ w/|B|, ≤ w/(|B|−1), ..., ≤ w/2, ≤ w TOTAL PAID ≤ w·(1/|B| + 1/(|B|−1) + ... + 1/2 + 1) = w·O(ln |B|) = w·O(ln n)

  33. Weighted set cover problem INSTANCE: A1,...,Am, weights w1,...,wm FEASIBLE SOLUTION: collection S of the Ai covering Ω OBJECTIVE (minimize): the cost of the collection Greedy algorithm: pick Ai with minimal wi / |Ai|; remove the elements of Ai from Ω; repeat Theorem: the greedy algorithm is an O(log n)-approximation algorithm.
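The greedy rule from these slides is only a few lines in code. A Python sketch (names are ours; ties broken by first index; assumes the family actually covers the universe):

```python
# Greedy weighted set cover: repeatedly pick the set minimizing
# weight / (number of still-uncovered elements it contains).

def greedy_set_cover(universe, sets, weights):
    uncovered = set(universe)
    picked = []
    while uncovered:
        # only sets that cover something new are candidates
        i = min((j for j in range(len(sets)) if sets[j] & uncovered),
                key=lambda j: weights[j] / len(sets[j] & uncovered))
        picked.append(i)
        uncovered -= sets[i]
    return picked

# Example: two cheap sets (total weight 2) beat the one expensive
# set of weight 3 that covers everything.
cover = greedy_set_cover(universe={1, 2, 3, 4},
                         sets=[{1, 2}, {3, 4}, {1, 2, 3, 4}],
                         weights=[1.0, 1.0, 3.0])
```

Note that the slides' wi / |Ai| ratio is computed against the remaining elements of Ω, which is what the `sets[j] & uncovered` intersection implements.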

  34. Clustering n points in R^m d(i,j) = distance between points i, j partition the points into k clusters of small diameter diam(C) = max{ d(i,j) : i, j ∈ C }

  35. Clustering k = 3

  36. Clustering k = 3

  37. Clustering k = 2

  38. Clustering k = 2

  39. k-Clustering INSTANCE n points in R^m FEASIBLE SOLUTION partition of [n] into C1,...,Ck COST max{ diam(Ci) : i ∈ [k] }, where diam(C) = max{ d(i,j) : i, j ∈ C }

  40. k-Clustering GREEDY ALGORITHM pick s1 ∈ [n]; for i from 2 to k do: pick si = the point farthest from s1,...,si−1; finally, Ci = {x ∈ [n] whose closest center is si}

  41. k-Clustering GREEDY ALGORITHM pick s1 ∈ [n]; for i from 2 to k do: pick si = the point farthest from s1,...,si−1; Ci = {x ∈ [n] whose closest center is si} (figure: center s1 chosen)

  42. k-Clustering GREEDY ALGORITHM (same as above; figure: centers s1, s2 chosen)

  43. k-Clustering GREEDY ALGORITHM (same as above; figure: centers s1, s2, s3 chosen)

  44. k-Clustering GREEDY ALGORITHM (same as above; figure: clusters formed around s1, s2, s3)

  45. k-Clustering GREEDY ALGORITHM pick s1 ∈ [n]; for i from 2 to k do: pick si = the point farthest from s1,...,si−1; Ci = {x ∈ [n] whose closest center is si} Theorem: GREEDY ALGORITHM IS A 2-APPROXIMATION ALGORITHM
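The farthest-point greedy above can be sketched in a few lines of Python (names are ours; assumes Euclidean points and Python 3.8+ for `math.dist`):

```python
import math

def greedy_k_center(points, k):
    """Pick s1 = points[0], then repeatedly the point farthest from
    the centers chosen so far; return the k center indices."""
    centers = [0]
    for _ in range(1, k):
        # farthest point = the one maximizing its distance to the nearest center
        nxt = max(range(len(points)),
                  key=lambda x: min(math.dist(points[x], points[s])
                                    for s in centers))
        centers.append(nxt)
    return centers

def assign_clusters(points, centers):
    """C_i = points whose closest center is centers[i] (label per point)."""
    return [min(range(len(centers)),
                key=lambda i: math.dist(p, points[centers[i]]))
            for p in points]

# Two tight pairs far apart: greedy places one center in each pair.
pts = [(0, 0), (0, 1), (10, 0), (10, 1)]
centers = greedy_k_center(pts, 2)       # → [0, 3]
labels = assign_clusters(pts, centers)  # → [0, 0, 1, 1]
```

The "farthest point" rule is exactly the max-of-min expression in the `key` above.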

  46. Theorem: GREEDY ALGORITHM IS A 2-APPROXIMATION ALGORITHM
     Let sk+1 be the point farthest from the chosen centers and r = d(sk+1, {s1,...,sk}).
     By the greedy choice, d(si, sj) ≥ r for all i ≠ j in {1,...,k+1}; by pigeonhole, two of these k+1 points lie in the same optimal cluster, so OPT ≥ r.
     Every point is within distance r of its closest center, so each greedy cluster has diameter ≤ 2r; hence cost of greedy ≤ 2r ≤ 2·OPT.
