
Approximations for Combinatorial Optimization

Approximations for Combinatorial Optimization. Moses Charikar, based on joint work with Amit Agarwal, Adriana Karagiozova, Konstantin Makarychev and Yury Makarychev. Topics: shortest paths; minimum cost networks; scheduling and load balancing; graph partitioning problems.



Presentation Transcript


  1. Approximations for Combinatorial Optimization Moses Charikar based on joint work with Amit Agarwal, Adriana Karagiozova, Konstantin Makarychev and Yury Makarychev

  2. Optimization Problems • Shortest paths • Minimum cost network • Scheduling, Load balancing • Graph partitioning problems • Constraint satisfaction problems

  3. Approximation Algorithms • Many optimization problems are NP-hard • Alternate approach: heuristics with provable guarantees • Guarantee (with approximation ratio α): Alg(I) ≥ α·OPT(I) (maximization); Alg(I) ≤ α·OPT(I) (minimization) • Complexity theory gives bounds on the best approximation ratios possible
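A guarantee of the kind stated above can be made concrete with the classical 2-approximation for Vertex Cover (not one of the talk's own algorithms; a standard textbook sketch): take both endpoints of any edge not yet covered. Any optimal cover must contain at least one endpoint of each such disjoint edge, so the output is at most twice the optimum.

```python
def vertex_cover_2approx(edges):
    """Maximal-matching heuristic: take both endpoints of every edge
    that is not yet covered. The matched edges are vertex-disjoint, so
    any cover needs one endpoint per matched edge: ratio <= 2."""
    cover = set()
    for u, v in edges:
        if u not in cover and v not in cover:
            cover.update((u, v))
    return cover

# 4-cycle 0-1-2-3 with chord 1-3; the optimum cover is {1, 3} (size 2).
edges = [(0, 1), (1, 2), (2, 3), (3, 0), (1, 3)]
cover = vertex_cover_2approx(edges)
```

Here the greedy pass returns a cover of size 4, exactly twice the optimum, illustrating that the factor-2 bound can be tight.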

  4. Mathematical Programming approaches • Sophisticated tools from convex optimization • e.g. linear programming: min cᵀx subject to Ax ≥ b, x ≥ 0 • Can find optimum solution in polynomial time

  5. Relax and Round • Express solution in terms of decision variables, typically {0,1} or {-1,+1} • Feasibility constraints on decision variables • Objective function • Relax variables to get mathematical program • Solve program optimally • Round fractional solution

  6. LP is a widely used tool in designing approximation algorithms • Interpret variable values as probabilities, distances, etc. • [Figure: fractional solutions rounded to integer solutions]

  7. Distance functions from LPs • [Leighton, Rao ’88] Distance function d • Triangle inequality: d(a, b) + d(b, c) ≥ d(a, c) • [Figure: points a, b, c with d(a, b) = 0, d(b, c) = 0, d(a, c) = 1, ruled out by the triangle inequality]

  8. Quadratic programming • Linear expressions in x_i x_j ? • NP-hard • Workaround: M_ij = x_i x_j • What can we say about M ? • M is positive semidefinite (psd) • Can add psd constraint • Semidefinite programming • Can solve to any desired accuracy

  9. Positive Semidefinite Matrices • M is psd iff • xᵀ M x ≥ 0 for all x • All eigenvalues of M are non-negative • M = Vᵀ V (Cholesky decomposition) • M_ij = v_i · v_j
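The equivalences on this slide can be checked numerically. The sketch below (assuming NumPy is available; the matrix is a made-up example) builds a psd matrix as VᵀV, verifies the eigenvalue condition, and recovers vectors with M_ij = ⟨v_i, v_j⟩ via a Cholesky factorization:

```python
import numpy as np

# A psd Gram matrix, built as V^T V so psd holds by construction.
# Columns of V are the vectors (1,0), (0,1), (0.6, 0.8).
V = np.array([[1.0, 0.0, 0.6],
              [0.0, 1.0, 0.8]])
M = V.T @ V

# All eigenvalues are non-negative (up to floating-point noise).
assert np.all(np.linalg.eigvalsh(M) >= -1e-12)

# Cholesky factorization M = L L^T recovers vectors with
# M_ij = <v_i, v_j>: row i of L plays the role of v_i.
L = np.linalg.cholesky(M + 1e-9 * np.eye(3))  # tiny jitter: M is rank-deficient
vectors = L
```

The jitter term is needed only because this M has rank 2; a strictly positive definite matrix factors directly.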

  10. Vector Programming • Variables are vectors • Linear constraints on dot products • Linear objective on dot products

  11. Relaxation for Max Cut • Integer program: max Σ_{(i,j)∈E} (x_i − x_j)²/4 over x_i ∈ {−1, +1} for all i • Vector relaxation: max Σ_{(i,j)∈E} (v_i − v_j)²/4 over vectors with ‖v_i‖² = 1 for all i

  12. SDP solution • Geometric embedding of vertices • Hyperplane rounding
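Hyperplane rounding can be sketched in a few lines (a toy illustration of the Goemans-Williamson step, with an instance chosen by me, not from the slides): draw a random Gaussian vector r and assign each vertex to a side of the cut by the sign of ⟨r, v_i⟩.

```python
import math
import random

random.seed(0)

def hyperplane_round(vectors, edges):
    """Random-hyperplane rounding: pick Gaussian r, cut by sign(<r, v_i>)."""
    dim = len(next(iter(vectors.values())))
    r = [random.gauss(0, 1) for _ in range(dim)]
    side = {i: sum(ri * vi for ri, vi in zip(r, v)) >= 0
            for i, v in vectors.items()}
    cut = sum(1 for u, w in edges if side[u] != side[w])
    return side, cut

# Triangle embedded at 120-degree angles (the optimal Max Cut SDP
# solution for K3); any hyperplane splits these vectors 2-1.
vectors = {i: (math.cos(2 * math.pi * i / 3), math.sin(2 * math.pi * i / 3))
           for i in range(3)}
side, cut = hyperplane_round(vectors, [(0, 1), (1, 2), (2, 0)])
```

For the triangle, every 2-1 split cuts exactly 2 of the 3 edges, matching the geometric intuition that well-separated vectors land on opposite sides often.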

  13. Can we do better ? • Better analysis ? Better rounding algorithm ? • Add constraints to the relaxation. • Δ-inequality (ℓ₂² triangle inequality) constraints: • (v_i − v_j)² + (v_j − v_k)² ≥ (v_i − v_k)²

  14. Sparsest Cut (uniform demands) • min over cuts (S, T) of δ(S, T) / (|S| · |T|)

  15. Cut Metric • δ_(S,T)(i, j) = 1 if i and j are on opposite sides of the cut (S, T), and 0 otherwise • Use relaxations of cut metrics

  16. Approximate cut metrics • How well can relaxed metrics be mapped into cut metrics ? • Metrics from LPs: log n distortion gives log n approximation [Bourgain ’85] [LLR ’95] [AR ’95] • SDP with Δ-inequalities ? • geometry of ℓ₂² metrics • Goemans-Linial conjecture: ℓ₂² metrics embed into ℓ₁ with constant distortion.

  17. Arora-Rao-Vazirani • [ARV ’04] • Breakthrough for SDPs with Δ-inequalities • O(√(log n)) approximation for balanced cut and sparsest cut

  18. How good are these SDP methods ?Can we do better ?

  19. Unique Games • Linear equations mod p, e.g. a system of equations of the form x_i + x_j ≡ c (mod 17) • 2 variables per equation • Maximize the number of satisfied constraints • In every constraint, for every value of one variable, a unique value of the other variable satisfies the constraint. • If 99% of equations are satisfiable, can we satisfy 1% of them ?
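The uniqueness property is easy to see in code. The instance below is illustrative (the slide's specific coefficients did not survive transcription); it uses difference constraints x_i − x_j ≡ c (mod p), where fixing either variable determines the unique satisfying value of the other:

```python
# Toy Unique Games instance over Z_17 with constraints x_i - x_j = c (mod p).
p = 17
constraints = [(0, 1, 3), (1, 2, 5), (2, 0, 9)]   # (i, j, c): x_i - x_j = c

def satisfied_fraction(assignment):
    ok = sum((assignment[i] - assignment[j]) % p == c
             for i, j, c in constraints)
    return ok / len(constraints)

# 3 + 5 + 9 = 17 = 0 (mod 17), so the constraint cycle is consistent:
# propagating from x_0 = 0 satisfies everything.
assignment = {0: 0}
assignment[1] = (assignment[0] - 3) % p    # forced by x_0 - x_1 = 3
assignment[2] = (assignment[1] - 5) % p    # forced by x_1 - x_2 = 5
```

Note how each variable's value was *forced* once its neighbor was fixed; that is exactly the "unique" in Unique Games.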

  20. Unique Games Conjecture • [Khot ’02] Given a Unique Games instance where a (1−ε)-fraction of constraints is satisfiable, it is NP-hard to satisfy even a δ-fraction of all constraints (for every constant positive ε and δ and sufficiently large domain size k).

  21. Implications of UGC • 2 is best possible for Vertex Cover [KR ’03] • 0.878 is best possible for Max Cut [KKMO ’04] [MOO ’05] • ω(1) hardness for sparsest cut; ω(1) for min 2CNF deletion [CKKRS ’05] [KV ’05]

  22. Algorithms for Unique Games • Domain size k, OPT = 1 − ε • Random solution satisfies a 1/k fraction • Non-trivial results only for ε = 1/poly(k) [AEH ’01] [Khot ’02] [Trevisan ’05] [GT ’06] • [Figure: number line from 0 to 1 marking the guarantees relative to 1 − ε]

  23. New results for Unique Games • [CMM ’05] • Given an instance where a (1−ε)-fraction of constraints is satisfiable, we satisfy a 1 − O(√(ε log k)) fraction • We can also satisfy a k^(−ε/(2−ε)) fraction

  24. New results for Unique Games • Guarantees 1 − O(√(ε log k)) and k^(−ε/(2−ε)) in different regimes of ε • Algorithms cover the entire range of ε.

  25. • Seems distant from the UGC setting • Optimal if UGC is true ! [KKMO ’05] [MOO ’05] • Any improvement will disprove UGC

  26. Matching upper and lower bounds ? • [Figure: integrality-gap construction using a Gaussian random vector g and unit vectors u, v, matching the 1 − O(√(ε log k)) guarantee]

  27. If pigs could whistle … • UGC seems to predict limitations of SDPs correctly • UGC based hardness for many problems matching best SDP based approximation • UGC inspired constructions of gap examples for SDPs • Disproof of Goemans-Linial conjecture: ℓ₂² metrics do not embed into ℓ₁ with constant distortion. [KV ’05]

  28. Is UGC true ? • Points to limitations of current techniques • Focuses attention on common hard core of several important optimization problems • Motivates development of new techniques

  29. Approaches to disproving UGC • Lifting procedures for SDPs • Lovasz-Schrijver, Sherali-Adams, Lasserre • Simulate products of k variables • Can we use them ?

  30. Moment matrices • SDP solution gives covariance matrix M • There exist normal random variables with covariances M_ij • Basis for SDP rounding algorithms • There exist {+1, −1} random variables with covariances M_ij / log n • Is something similar possible for higher order moment matrices ?
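The first claim on this slide is constructive: any psd matrix with unit diagonal is the covariance matrix of jointly normal random variables. A quick numerical check (assuming NumPy; the matrix M is a made-up example, not from an actual SDP run):

```python
import numpy as np

rng = np.random.default_rng(0)

# A psd "SDP solution" covariance matrix with unit diagonal.
M = np.array([[1.0, 0.5, 0.2],
              [0.5, 1.0, 0.4],
              [0.2, 0.4, 1.0]])
assert np.all(np.linalg.eigvalsh(M) >= 0)   # psd, so Gaussians exist

# Sample jointly normal variables with exactly these covariances and
# confirm the empirical second moments match M.
samples = rng.multivariate_normal(np.zeros(3), M, size=200_000)
empirical = samples.T @ samples / len(samples)
```

Sampling such Gaussians and then thresholding them is the engine behind many SDP rounding algorithms, which is why the existence statement matters.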

  31. Glimpse of research directions • Whirlwind tour • Quick intro to variety of problems my group has looked at recently • No details !

  32. Tighter local versus global properties of metric spaces Moses Charikar Konstantin Makarychev Yury Makarychev

  33. Local versus Global • Local properties: properties of subsets • Global properties: properties of entire set • What do local properties tell us about global properties ? • Property of interest: embeddability in normed spaces

  34. Motivations • Natural mathematical question • Many questions of similar flavor have been studied • Lift-and-project methods in optimization • Can guarantee local properties • Need guarantee on global property

  35. Local versus global distortion • Metric on n points • Property: embeddability into ℓ₁ • D_loc: distortion for embedding any subset of size k • D_glob: distortion for embedding the entire metric • What is the relationship between D_loc and D_glob ?

  36. Results • [Table: upper and lower bounds on D_glob in terms of D_loc, with factors involving log(n/k) and log log n]

  37. Aspects of Network Design: Adriana Karagiozova • Multicommodity Buy at Bulk Network DesignCharikar, Karagiozova • Terminal Backup and Simplex MatchingAnshelevich, Karagiozova

  38. Multicommodity Buy at Bulk Network Design • installing capacity in communication networks • Given graph, costs for installing capacity on edges • Pairs of communicating nodes with capacity demands • Goal: Install capacity on network to satisfy all users • Costs exhibit economy of scale

  39. Algorithm Overview • Assume all unit demands • Simple greedy algorithm • Randomly permute demand pairs • “Inflate” demands so that the kth demand is n/k • Install capacity in greedy fashion to route the kth demand • Intuition: • Inflated demands encourage large investments in the network that will be useful later • The first k pairs in a random permutation act like a random sample that suggests where investments should be made
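The greedy pass above can be sketched as follows. This is a simplified illustration only: the concave cost function (√u per edge), the instance, and the incremental-cost shortest-path metric are my choices, not details from the Charikar-Karagiozova algorithm's analysis.

```python
import heapq
import math
import random

random.seed(1)

def shortest_path(adj, lengths, s, t):
    """Dijkstra over undirected edges keyed by frozenset({u, v})."""
    dist, prev, pq = {s: 0.0}, {}, [(0.0, s)]
    while pq:
        d, u = heapq.heappop(pq)
        if d > dist.get(u, math.inf):
            continue
        for v in adj[u]:
            nd = d + lengths[frozenset((u, v))]
            if nd < dist.get(v, math.inf):
                dist[v], prev[v] = nd, u
                heapq.heappush(pq, (nd, v))
    path, v = [], t
    while v != s:
        path.append(frozenset((v, prev[v])))
        v = prev[v]
    return path

def greedy_buy_at_bulk(n, edge_list, pairs, cost=math.sqrt):
    """Route the k-th pair of a random permutation along the path of
    cheapest *incremental* concave cost, installing an inflated n/k units."""
    adj = {v: [] for v in range(n)}
    for u, v in edge_list:
        adj[u].append(v)
        adj[v].append(u)
    cap = {frozenset(e): 0.0 for e in edge_list}
    order = list(pairs)
    random.shuffle(order)
    for k, (s, t) in enumerate(order, start=1):
        inflated = n / k
        # Edge length = marginal cost of carrying the inflated demand.
        lengths = {e: cost(u + inflated) - cost(u) for e, u in cap.items()}
        for e in shortest_path(adj, lengths, s, t):
            cap[e] += inflated
    return sum(cost(u) for u in cap.values())

total = greedy_buy_at_bulk(4, [(0, 1), (1, 2), (2, 3), (0, 3), (0, 2)],
                           [(0, 2), (1, 3)])
```

Because √u is concave, edges that already carry capacity are cheap to reuse, so later demands gravitate toward the investments made by the early (inflated) demands, mirroring the intuition on the slide.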

  40. Terminal Backup and Simplex Matching • Suggested by Mung Chiang’s group • Given a graph, a set of terminals, and edge costs • Find a minimum cost set of edges so that every terminal is connected to at least one other terminal • Generalization of classical matching • Algorithm: generalization of augmenting paths for matchings

  41. New Results in Approximate Optimization:Amit Agarwal • Directed Cut Problems [Agarwal, Alon, Charikar] • Advantage of Network Coding for Improving Throughput [Agarwal, Charikar]

  42. Part 1: Undirected Cut Problems • G = (V, E): undirected graph • Cost on edges c_e: E → R⁺ • Min-cut • Single source and sink • 2 source-sink pairs • Polynomial time [Hu] • [Figure: sources s1, s2 and sinks t1, t2]
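The single source-sink min-cut mentioned here is polynomial via max-flow/min-cut duality. A compact Edmonds-Karp sketch (standard textbook algorithm, with a small made-up instance) computes the max flow, whose value equals the minimum cut:

```python
from collections import deque

def max_flow(n, capacity, s, t):
    """Edmonds-Karp: augment along shortest paths in the residual graph.
    By max-flow/min-cut duality the result equals the min s-t cut."""
    flow = 0
    residual = [row[:] for row in capacity]
    while True:
        parent = [-1] * n
        parent[s] = s
        q = deque([s])
        while q and parent[t] == -1:          # BFS for an augmenting path
            u = q.popleft()
            for v in range(n):
                if parent[v] == -1 and residual[u][v] > 0:
                    parent[v] = u
                    q.append(v)
        if parent[t] == -1:                   # no augmenting path left
            return flow
        bottleneck, v = float('inf'), t       # min residual capacity on path
        while v != s:
            bottleneck = min(bottleneck, residual[parent[v]][v])
            v = parent[v]
        v = t
        while v != s:                         # push flow along the path
            residual[parent[v]][v] -= bottleneck
            residual[v][parent[v]] += bottleneck
            v = parent[v]
        flow += bottleneck

# Diamond graph: source 0, sink 3, two disjoint unit paths => min cut 2.
cap = [[0, 1, 1, 0],
       [0, 0, 0, 1],
       [0, 0, 0, 1],
       [0, 0, 0, 0]]
value = max_flow(4, cap, 0, 3)
```

The two-pair case [Hu] and the general multicut of the next slide are exactly where this clean duality starts to break down.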

  43. Multicut: multiple source-sink pairs • k source-sink pairs s_i–t_i, i ∈ [k] • NP-hard • O(log k) approximation [Garg-Vazirani-Yannakakis ’96] • Objective: cheapest E′ ⊆ E so that, in (V, E \ E′), no t_i can be reached from its corresponding s_i • Alternate definition: at least one edge from every s_i–t_i path should be removed


  46. Outline • Directed Cut Problems • Advantage of Network Coding for Improving Throughput

  47. Part 2: Definitions • Graph G = (V, E) • Capacities c_e: E → R⁺ • Source m0 and k terminals m1, …, mk • Send b bits of information to all terminals • A Steiner tree connects the source to all terminals • The min-cost Steiner tree is the cheapest such tree • Set of all Steiner trees

  48. Advantage of Network Coding • All edges have capacity 1 • Without coding: < 1 unit each of A and B to m1 and m2 • With coding: 1 unit each of A and B to m1 and m2 • What is the maximum increase in capacity from network coding ? • Our result: the maximum increase is the same as the integrality gap of the Steiner tree relaxation • [Figure: butterfly network; source m0 sends A and B; the shared middle edge carries A xor B; terminals m1, m2 each recover both A and B]
