
Approximation Algorithms for NP-hard Combinatorial Problems


Presentation Transcript


  1. Approximation Algorithms for NP-hard Combinatorial Problems Magnús M. Halldórsson Reykjavik University www.ru.is/~mmh/ewscs13/lec1.ppt

  2. Post-doc & Ph.D. Positions available • Part of research project titled “Algorithms for Wireless Scheduling”

  3. Why? Motivation • Many computational problems are NP-hard • Instead of looking for the best solution, we may want to find a solution that is good enough • Instead of an optimal solution, we seek approximations

  4. Why study approximation algorithms? • Study valuable, general-purpose problem solving techniques • Interesting combinatorics, from a CS viewpoint • Related to various other paradigms: • Online algorithms • Streaming algorithms • Distributed algorithms

  5. What I expect that you already know • Discrete structures and problems • Graphs and networks • Basic knowledge of algorithms • Some experience with analysis of algorithms • E.g. why Dijkstra’s algorithm works • Bonus: • Mathematical programming • Basic probability theory

  6. Plan today • Get comfortable with our main problems • Overview the algorithmic strategies that we will examine • Look at the most naive approximation algorithms

  7. Fundamental problems • There are millions of optimization problems • We focus on a few fundamental problems: • The techniques developed can be relatively easily transferred to other problems • The classic graph problems, w/o connectivity

  8. Problems we shall consider • Independent set • Vertex cover • Chromatic number • Dominating set • Domatic number • Max Cut • And sometimes weighted versions

  9. Graphs and Notation • n = 9 • m = 13 • Δ = 4 (maximum degree) • δ = 2 (minimum degree) • d = 2m/n = 26/9 (average degree)
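A minimal Python sketch (illustrative, not from the slides) of how these quantities are computed for a graph given as vertex and edge lists; on the slide's graph it would return n=9, m=13, Δ=4, δ=2 and d = 2m/n = 26/9:

    def graph_stats(vertices, edges):
        # Degree of every vertex (isolated vertices count as degree 0).
        deg = {v: 0 for v in vertices}
        for u, v in edges:
            deg[u] += 1
            deg[v] += 1
        n, m = len(vertices), len(edges)
        # n, m, maximum degree, minimum degree, average degree 2m/n
        return n, m, max(deg.values()), min(deg.values()), 2 * m / n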

  10. Independent sets • Find: An independent set S of vertices • Objective: Maximize |S|

  11. Independent sets • Find: An independent set of vertices • Maximize: Size

  12. Intervals & corresponding interval graph

  13. Interval selection & corresponding interval graph

  14. Interval selection & corresponding interval graph
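As a side note not on the slides: for intervals, a maximum independent set (an optimal interval selection) can be found exactly by the classic earliest-finish-time greedy. A minimal sketch, assuming intervals are given as (start, end) pairs and treated as half-open:

    def select_intervals(intervals):
        # Sort by right endpoint and repeatedly take the first interval
        # that does not overlap anything chosen so far.
        chosen, last_end = [], float("-inf")
        for start, end in sorted(intervals, key=lambda iv: iv[1]):
            if start >= last_end:
                chosen.append((start, end))
                last_end = end
        return chosen

Interval graphs are a special case; on general graphs Independent Set is NP-hard.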

  15. Distance-k independent set in graph G = independent set in the power graph G^k (the figure uses k = 2) • (u,v) ∈ E(G^k) ⇔ dist_G(u,v) ≤ k
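A minimal sketch (illustrative, not from the slides) of how G^k can be built with a depth-limited BFS from every vertex; a distance-k independent set in G is then an ordinary independent set in the resulting graph:

    from collections import deque

    def power_graph(adj, k):
        # adj: dict mapping each vertex to the set of its neighbours in G.
        # Returns the edge set of G^k: {u, v} is an edge iff dist_G(u, v) <= k.
        edges = set()
        for s in adj:
            dist = {s: 0}
            queue = deque([s])
            while queue:                      # BFS truncated at depth k
                u = queue.popleft()
                if dist[u] == k:
                    continue
                for v in adj[u]:
                    if v not in dist:
                        dist[v] = dist[u] + 1
                        queue.append(v)
            edges.update(frozenset((s, v)) for v in dist if v != s)
        return edges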

  16. Matching • Input: Graph • Goal: Find a maximum-cardinality matching • [Figure: example graph with vertices A–E and 1–5]

  17. Vertex Cover • S is a vertex cover if every edge has at least one endpoint in S • Objective: Minimize |S|

  18. Vertex Cover • S is a vertex cover if every edge has at least one endpoint in S • Objective: Minimize |S|

  19. Map coloring

  20. Austria and its neighbors?

  21. Graph Coloring

  22. Chromatic Number • χ(G) = 3

  23. Where to locate icecream stands? • The Icecream Stands problem: Every kid should have access to an icecream stand in the next street. How many stands are needed? • A dominating set for a graph is a set of vertices whose neighbors, along with themselves, constitute all the vertices in the graph.
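A one-line check of the definition (an illustrative sketch, not from the slides): S is dominating if every vertex is in S or has a neighbour in S.

    def is_dominating(adj, S):
        # adj: dict vertex -> set of neighbours; S: candidate set of vertices.
        return all(v in S or adj[v] & S for v in adj)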

  24. Domatic number • Find: A partition (coloring) of the vertices such that each color is a dominating set • Objective: Maximize the number of colors

  25. The MAX CUT problem Input. A graph. Feasible solution. A set S of vertices. Value of solution. Number of edges cut. Objective. Maximize.

  26. Weighted versions • In Weighted Independent Set: • Given a graph G with weights on vertices • Find an independent set with maximum total weight • Weights on vertices: Independent set, Vertex cover, Dominating set • Weights on edges: Max Cut

  27. Algorithmic strategies • Greedy • Subgraph removal • Local search • Probabilistic method • Local ratio • Linear programming rounding • Semi-definite programming

  28. Core issue: Bounding OPT • We want to compare „our“ solution, ALG, to OPT, the optimal solution • I.e., show that ALG ≤ ρ · OPT (stated here for minimization; the inequalities reverse for maximization) • But OPT is hard to get a handle on (NP-hard!) • Instead, we compare ALG to an easier quantity B(I) that bounds OPT: • B(I) ≤ OPT(I), and • ALG(I) ≤ ρ · B(I)
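Spelled out as a chain (a worked restatement for the minimization case; B(I) is just a generic name for the easier bound):

    B(I) \le \mathrm{OPT}(I) \quad\text{and}\quad \mathrm{ALG}(I) \le \rho \cdot B(I)
    \;\Longrightarrow\; \mathrm{ALG}(I) \le \rho \cdot B(I) \le \rho \cdot \mathrm{OPT}(I)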

  29. Greedy • Previously known as „myopic“ • In each round, the algorithm makes the action that is best according to some criteria

  30. Subgraph removal • An independent set can contain at most 1 vertex from a clique • If we remove all big cliques, the problem may become easier. • If OPT was large, it is still fairly large after removing the cliques

  31. Local search • Can I make small, „local“ changes to my solution, to get a better solution? • Ex. 1: „Shortcuts“ • Ex. 2: If I throw out 1 vertex, can I replace it with 2?

  32. Probabilistic method • A „random“ solution is often pretty good • Sometimes close to best possible • Sometimes it needs minor changes • Basic facts from probability: • Product rule • Union bound • Linearity of expectation
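A minimal sketch of the probabilistic method applied to MAX CUT (slide 25): put each vertex on a random side; by linearity of expectation every edge is cut with probability 1/2, so the expected cut value is m/2, which is at least half of OPT. (Illustrative code, not from the slides.)

    import random

    def random_cut(vertices, edges):
        # Each vertex joins S independently with probability 1/2.
        S = {v for v in vertices if random.random() < 0.5}
        cut = sum(1 for u, v in edges if (u in S) != (v in S))
        return S, cut                      # E[cut] = len(edges) / 2 >= OPT / 2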

  33. LP rounding • Linear programming is a powerful hammer • The most general purpose solving method • All of our problems can be written as Integer Programming problems • X_i = 1, if v_i is in the independent set, • X_i = 0, if v_i is not in the set • LP: Allow any value in the range [0,1] • Solvable in polynomial time • „Rounding“: Use LP-value to choose the binary 0/1 value
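A minimal sketch of LP rounding, shown here on Vertex Cover (the standard textbook illustration of the technique; the slide phrases the integer program for Independent Set). Solve the relaxation with x_v in [0,1] and keep every vertex whose LP value is at least 1/2; since x_u + x_v ≥ 1 on every edge, at least one endpoint survives, and the cover costs at most 2·OPT. scipy.optimize.linprog is used only as a convenient solver.

    import numpy as np
    from scipy.optimize import linprog

    def lp_round_vertex_cover(n, edges):
        # minimize sum_v x_v  s.t.  x_u + x_v >= 1 for every edge, 0 <= x_v <= 1
        c = np.ones(n)
        A_ub = np.zeros((len(edges), n))
        for i, (u, v) in enumerate(edges):
            A_ub[i, u] = A_ub[i, v] = -1.0          # -x_u - x_v <= -1
        res = linprog(c, A_ub=A_ub, b_ub=-np.ones(len(edges)), bounds=[(0, 1)] * n)
        return {v for v in range(n) if res.x[v] >= 0.5}   # threshold rounding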

  34. Semi-definite programming • „Vector programming“ assigns each vertex a vector in n-dimensional space • LP value = 1-dimensional vector • Use nearness or farness to decide on rounding

  35. Performance ratio • An algorithm A is ρ-approximate if, for every input instance I, A(I) ≥ ρ · OPT(I) (maximization) or A(I) ≤ ρ · OPT(I) (minimization)

  36. VC - Approximation Algorithm • C ← ∅ • E' ← E • while E' ≠ ∅ • do let (u,v) be an arbitrary edge of E' • C ← C ∪ {u,v} • remove from E' every edge incident to either u or v • return C
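The same procedure as a runnable Python sketch (my transcription of the pseudocode above, not code from the slides):

    def approx_vertex_cover(edges):
        # Repeatedly pick an arbitrary uncovered edge and add both endpoints.
        cover = set()
        remaining = set(map(frozenset, edges))
        while remaining:
            u, v = next(iter(remaining))
            cover |= {u, v}
            remaining = {e for e in remaining if u not in e and v not in e}
        return cover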

  37. Demo • Compare this cover to the one from the example

  38. Polynomial Time • Step costs: O(n²), O(1), O(n²), O(n) • C ← ∅ • E' ← E • while E' ≠ ∅ do • let (u,v) be an arbitrary edge of E' • C ← C ∪ {u,v} • remove from E' every edge incident to either u or v • return C

  39. Correctness The set of vertices our algorithm returns is clearly a vertex-cover, since we iterate until every edge is covered.

  40. How Good an Approximation is it? • Observe the set of edges our algorithm chooses: no two of them share a vertex • Any vertex cover must contain at least 1 endpoint of each of these edges • Our cover contains both endpoints of each, hence it is at most twice as large

  41. Another algorithm • Algorithm 1.2 (Cardinality vertex cover): Find a maximal matching in G and output the set of matched vertices • Given a graph G=(V,E), a subset of the edges M ⊆ E is a matching if no two edges of M share an endpoint • A matching is maximal if no more edges can be added • The previous solution is a maximal matching!
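The matching view as a sketch (illustrative, not from the slides): scan the edges, greedily add each edge whose endpoints are both unmatched; the matched vertices form exactly the kind of cover built by the previous algorithm.

    def matching_vertex_cover(edges):
        matched = set()
        for u, v in edges:
            if u not in matched and v not in matched:   # edge joins the matching
                matched |= {u, v}
        return matched                                   # cover of size 2 * |matching|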

  42. Linking ALG to OPT (relating the approximation to the optimal) • [Diagram: a maximal matching relating the approximate solution to the optimal solution]

  43. Lower bounding OPT • OPT: an optimal vertex cover (we also use OPT for its size) • ALG: the solution found by our algorithm • M: a maximal matching • We argued a lower bound for OPT in terms of M: OPT ≥ |M| • Thus, maximal matching is the combinatorial structure that is relevant here • The performance analysis follows: since ALG = 2|M|, the performance ratio is at most 2

  44. Can we improve this approximation? • Can we improve the analysis for this algorithm? • Can we make this algorithm more clever? • What if we carefully choose the maximal matching? • Can we find a better algorithm?

  45. Tight example #1 • OPT picks 1 vertex; ALG selects 2 • But this is only one small graph; we want a family of graphs of infinite size

  46. Tight example #2 • Complete bipartite graph Kn,n (here n=4) • Any maximal matching contains all 2n vertices • So the algorithm's solution always contains 2n vertices • An optimal solution contains only n vertices

  47. “This was too simple a solution” • It is simple, but not stupid: many seemingly smarter heuristics give far worse performance in the worst case. • It is easy to complicate heuristics by adding special cases or details. For example, picking the edge whose endpoints have the highest degree does not improve the worst case; it just makes the algorithm harder to analyze.

  48. Naive approximations • Bound ALG and OPT independently • This forms the baseline with which we compare
