
New Insights into Semidefinite Programming for Combinatorial Optimization



Presentation Transcript


  1. New Insights into Semidefinite Programming for Combinatorial Optimization • Moses Charikar, Princeton University

  2. Optimization Problems • Shortest paths • Minimum cost network design • Scheduling, load balancing • Graph partitioning problems • Constraint satisfaction problems

  3. Approximation Algorithms • Many optimization problems are NP-hard • Alternate approach: heuristics with provable guarantees • Guarantee: Alg(I) ≥ α · OPT(I) (maximization); Alg(I) ≤ α · OPT(I) (minimization) • Complexity theory gives bounds on the best approximation ratios possible

  4. Mathematical Programming approaches • Sophisticated tools from convex optimization • e.g. Linear programming: min c · x subject to Ax ≥ b • Can find the optimum solution in polynomial time

  5. Relax and Round • Express solution in terms of decision variables, typically {0,1} or {-1,+1} • Feasibility constraints on decision variables • Objective function • Relax variables to get mathematical program • Solve program optimally • Round fractional solution
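
The recipe above can be made concrete. Below is a minimal sketch, using Vertex Cover as a stand-in example (my choice of illustration, not a problem from this talk): relax the {0,1} variables to [0,1], solve the LP with scipy, and round at threshold 1/2, which gives the classic 2-approximation.

```python
import numpy as np
from scipy.optimize import linprog

def vertex_cover_relax_and_round(n, edges):
    # LP relaxation: minimize sum(x) s.t. x_u + x_v >= 1 per edge, 0 <= x <= 1.
    c = np.ones(n)
    A_ub = np.zeros((len(edges), n))
    for row, (u, v) in enumerate(edges):
        A_ub[row, u] = A_ub[row, v] = -1.0        # encodes -x_u - x_v <= -1
    b_ub = -np.ones(len(edges))
    lp = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, 1)] * n)
    # Rounding: keep every vertex with fractional value >= 1/2. Each edge has
    # an endpoint with x >= 1/2, so the result is a feasible cover, and its
    # cost is at most 2 * LP <= 2 * OPT.
    return [v for v in range(n) if lp.x[v] >= 0.5]

print(vertex_cover_relax_and_round(4, [(0, 1), (1, 2), (2, 3)]))  # e.g. [1, 2]
```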

  6. LP is a widely used tool in designing approximation algorithms • Interpret variable values as probabilities, distances, etc. • [Figure: integer solutions sit inside the larger set of fractional solutions of the relaxation]

  7. Quadratic programming • Linear expressions in xi xj ? • NP-hard • Workaround: Mij = xi xj • What can we say about M ? • M is positive semidefinite (psd) • Can add psd constraint • Semidefinite programming • Can solve to any desired accuracy

  8. Positive Semidefinite Matrices • M is psd iff • x^T M x ≥ 0 for all x • All eigenvalues of M are non-negative • M = V^T V (Cholesky decomposition) • Mij = vi · vj
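
A small numerical check of the equivalences above (the matrix is an arbitrary example of mine), showing both the eigenvalue test and the factorization M = V^T V with Mij = vi · vj:

```python
import numpy as np

M = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# psd test: all eigenvalues non-negative (with a tolerance for floating point).
assert np.all(np.linalg.eigvalsh(M) >= -1e-9)

# Cholesky gives M = L L^T; with V = L^T, column i of V is the vector v_i.
V = np.linalg.cholesky(M).T
assert np.allclose(M, V.T @ V)            # M = V^T V
print(V[:, 0] @ V[:, 1])                  # v_0 . v_1 recovers M[0, 1] = 1.0
```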

  9. Vector Programming • Variables are vectors • Linear constraints on dot products • Linear objective on dot products

  10. Max-Cut • Given graph G • Partition vertices into two sets • Maximize number of edges cut • Random solution cuts half the edges in expectation • Nothing better known until Goemans-Williamson came along!

  11. Relaxation for Max Cut • Integer program: max Σ_{(i,j)∈E} (xi − xj)²/4, for all i: xi² = 1 • Vector relaxation: max Σ_{(i,j)∈E} (vi − vj)²/4, for all i: vi · vi = 1

  12. SDP solution • Geometric embedding of vertices • Hyperplane rounding

  13. Rounding SDP solution • Pick random vector r • Partition vertices according to sign(vi · r) • Prob[(i,j) cut] = θij / π • Contribution of (i,j) to SDP = (1 − cos θij)/2 • E[cut] ≥ 0.878 · SDP, since the minimum over θ of (θ/π) / ((1 − cos θ)/2) is ≈ 0.878 • 0.878 approximation
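
A minimal sketch of hyperplane rounding, assuming the SDP has already been solved and its unit vectors are in hand; the triangle embedding below (three unit vectors at mutual angle 120°) is my own toy example, not from the talk:

```python
import numpy as np

def hyperplane_round(vectors, rng):
    # A Gaussian vector r defines a uniformly random hyperplane through the
    # origin; vertex i goes to the side given by the sign of v_i . r.
    r = rng.standard_normal(vectors.shape[1])
    return vectors @ r >= 0

def cut_size(side, edges):
    return sum(1 for i, j in edges if side[i] != side[j])

# Toy instance: a triangle, embedded at mutual angle 120 degrees. Each edge
# contributes (1 - cos 120)/2 = 3/4 to the SDP, so SDP = 9/4, while a random
# hyperplane here almost surely cuts exactly 2 edges (ratio 8/9 > 0.878).
vecs = np.array([[1.0, 0.0],
                 [-0.5, np.sqrt(3) / 2],
                 [-0.5, -np.sqrt(3) / 2]])
edges = [(0, 1), (1, 2), (0, 2)]
side = hyperplane_round(vecs, np.random.default_rng(0))
print(cut_size(side, edges))   # 2
```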

  14. Can we do better? • Better analysis? Better rounding algorithm? • [Karloff '97] The guarantee for random hyperplane rounding cannot be improved. • [Feige, Schechtman '01] The SDP value can differ from the optimum by a factor of 0.878.

  15. An Improved Bound? • Add constraints to the relaxation. • Δ-inequality constraints: • (vi − vj)² + (vj − vk)² ≥ (vi − vk)² • [Feige, Schechtman '01] showed a gap for the SDP with Δ-inequalities, slightly better than 0.878

  16. SDP applications • DiCut, Max k-Cut • Constraint satisfaction problems: 2-SAT, 3-SAT • Graph coloring • …

  17. Sparsest Cut (uniform demands) • minimize over cuts (S, T): δ(S, T) / (|S| · |T|)

  18. Sparsest Cut (non-uniform demands) • cij = capacity of the edge (i, j) • dij = demand of the pair (i, j) • minimize over cuts (S, T): ( Σ_{(i,j)∈E, i∈S, j∈T} cij ) / ( Σ_{i∈S, j∈T} dij )

  19. Cut Metric • δS(i, j) = 1 if the cut (S, T) separates i and j, 0 otherwise • Use relaxations of cut metrics
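
A concrete rendering of the cut metric and of the sparsest-cut ratio from the previous slide (the 4-cycle data is an assumed example of mine):

```python
# delta_S(i, j) = 1 if the cut (S, V \ S) separates i and j, else 0.
def cut_metric(S):
    return lambda i, j: 1 if (i in S) != (j in S) else 0

# Sparsity of a single cut S: capacity crossing the cut divided by demand
# crossing the cut; sparsest cut minimizes this over all S.
def sparsity(S, capacities, demands):
    d = cut_metric(S)
    return (sum(c * d(i, j) for (i, j), c in capacities.items()) /
            sum(dem * d(i, j) for (i, j), dem in demands.items()))

# Example: a 4-cycle with unit capacities and a single demand pair (0, 2).
caps = {(0, 1): 1, (1, 2): 1, (2, 3): 1, (3, 0): 1}
dems = {(0, 2): 1}
print(sparsity({0}, caps, dems))      # 2 edges cut / 1 demand separated = 2.0
print(sparsity({0, 1}, caps, dems))   # also 2.0: edges (1,2) and (3,0) cross
```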

  20. Distance functions from LPs • [Leighton, Rao '88] Distance function d. • Triangle inequality: d(a, b) + d(b, c) ≥ d(a, c) • Rounding the LP solution involves mapping the distance function to a combination of cut metrics • [Figure: points a, b, c with d(a, b) = 0, d(b, c) = 0 but d(a, c) = 1, which the triangle inequality rules out]

  21. Relaxed cut metrics • How well can relaxed metrics be mapped into cut metrics? • Metrics from LPs: log n distortion gives log n approximation [Bourgain '85] [LLR '95] [AR '95] • SDP with Δ-inequalities? • (vi − vj)² + (vj − vk)² ≥ (vi − vk)² • geometry of ℓ₂² metrics • Goemans-Linial conjecture: ℓ₂² metrics embed into ℓ₁ with constant distortion.

  22. Arora-Rao-Vazirani • [ARV '04] • Breakthrough for SDPs with Δ-inequalities • O(√(log n)) approximation for balanced cut and sparsest cut

  23. ARV Separation Theorem • [Arora, Rao, Vazirani '04, Lee '05]: • Unit vectors vi satisfy the triangle inequalities: |vi − vj|² + |vj − vk|² ≥ |vi − vk|² • (and a spreading constraint) • ⇒ there exist sets S and T that: • are Δ = Ω(1/√(log n)) separated • each contain a constant fraction of all vertices

  24. Applications • Min UnCut: O(√(log n)) approximation [ACMM '05] • Min 2-CNF deletion: O(√(log n)) approximation [ACMM '05] • Directed analog of the ARV separation lemma

  25. Applications • Arrangement problems • Minimum Linear Arrangement [CHKR '06] [FL '06] • Embedding in d dimensions [CMM '07] • Graph coloring • O(n^0.2) coloring of 3-colorable graphs [ACC '06]

  26. How good are these SDP methods? Can we do better?

  27. Unique Games • Linear equations mod p, e.g.: x21 + x33 ≡ 14 (mod 17); x16 + x43 ≡ 2 (mod 17); … ; x53 + x91 ≡ 9 (mod 17) • 2 variables per equation • maximize the number of satisfied constraints • In every constraint, for every value of one variable, a unique value of the other variable satisfies the constraint. • If 99% of equations are satisfiable, can we satisfy 1% of them?
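
A minimal model of such an instance (the encoding and the example equations are my own sketch), with constraints of the form xi + xj ≡ c (mod p); fixing either variable leaves exactly one satisfying value for the other, which is the uniqueness property:

```python
import random

def satisfied_fraction(equations, assignment, p):
    # equations: list of (i, j, c) encoding x_i + x_j = c (mod p).
    good = sum((assignment[i] + assignment[j]) % p == c
               for i, j, c in equations)
    return good / len(equations)

# Baseline (see the next slides): a uniformly random assignment satisfies a
# 1/p fraction of the constraints in expectation.
p = 17
eqs = [(0, 1, 14), (2, 3, 2), (4, 5, 9)]
assignment = [random.randrange(p) for _ in range(6)]
print(satisfied_fraction(eqs, assignment, p))
```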

  28. Unique Games Conjecture • [Khot '02] Given a Unique Games instance where a 1 − ε fraction of constraints is satisfiable, it is NP-hard to satisfy even a δ fraction of all constraints (for every constant positive ε and δ, and sufficiently large domain size k).

  29. Implications of UGC • 2 is best possible for Vertex Cover [KR '03] • 0.878 is best possible for Max Cut [KKMO '04] [MOO '05] • ω(1) hardness for sparsest cut, ω(1) for min 2-CNF deletion [CKKRS '05] [KV '05]

  30. Algorithms for Unique Games • Domain size k, OPT = 1 − ε • Random solution satisfies a 1/k fraction • Non-trivial results only for ε = 1/poly(k) [AEH '01] [Khot '02] [Trevisan '05] [GT '06]

  31. Algorithms for Unique Games • [CMM '05] • Given an instance where a 1 − ε fraction of constraints is satisfiable, we satisfy a 1 − O(√(ε log k)) fraction • We can also satisfy an Ω(k^(−ε/(2−ε))) fraction

  32. Algorithms for Unique Games • The algorithms cover the entire range of ε: roughly 1 − O(√(ε log k)) when ε ≤ 1/log k, k^(−ε/(2−ε)) for larger ε, and O(1/k) (comparable to a random assignment) as ε approaches 1.

  33. Seems distant from the UGC setting • Optimal if UGC is true! [KKMO '05] [MOO '05] • Any improvement will disprove UGC

  34. Matching upper and lower bounds? • Given an instance with OPT > 1 − ε, can we satisfy more than a k^(−ε/(2−ε)) fraction, or more than a 1 − O(√(ε log k)) fraction? • The analysis reduces to a geometric question: two unit vectors u, v with u · v = 1 − ε, projected onto a Gaussian random vector g

  35. If pigs could whistle … • UGC seems to predict limitations of SDPs correctly • UGC-based hardness for many problems matching the best SDP-based approximation • UGC-inspired constructions of gap examples for SDPs • Disproof of the Goemans-Linial conjecture: ℓ₂² metrics do not embed into ℓ₁ with constant distortion. [KV '05]

  36. Is UGC true? • Points to limitations of current techniques • Focuses attention on the common hard core of several important optimization problems • Motivates development of new techniques

  37. Approaches to disproving UGC • Focus on possibly easier problems • Max Cut: OPT = 1 − ε; beat 1 − ε^(1/2) [GW '94] • Max k-CSP: constraints are ANDs of k literals; maximize # satisfied constraints • Beat k/2^k [ST '06] [CMM '07] • Distinguish between 1/k-satisfiable and 1/2^k-satisfiable instances

  38. Approaches to disproving UGC • Systematic procedures to strengthen relaxations • Lift-and-project for SDPs • Lovász-Schrijver, Sherali-Adams, Lasserre • Simulate products of k variables • Can we use them?

  39. Lift-and-project • How good/bad are solutions obtained from lift-and-project? • Limited algorithmic success so far • Clever constructions to show limitations of lift-and-project • Connections to local-global phenomena • If every subset of size k has a nice property, does the property hold globally?

  40. Moment matrices • SDP solution gives a covariance matrix M • There exist jointly normal random variables with covariances Mij (see the sketch below) • Basis for SDP rounding algorithms • There exist {+1, −1} random variables with covariances Mij / log n • Is something similar possible for higher-order moment matrices?
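
A sketch of the first fact (the covariance matrix below is an arbitrary example of mine): sample jointly normal variables with covariance M via the Cholesky factor. Hyperplane rounding is the special case that keeps only the signs of these Gaussians.

```python
import numpy as np

def sample_gaussians(M, num_samples, rng):
    V = np.linalg.cholesky(M)           # M = V V^T
    z = rng.standard_normal((M.shape[0], num_samples))
    return V @ z                        # each column is a draw from N(0, M)

M = np.array([[1.0, 0.5],
              [0.5, 1.0]])
g = sample_gaussians(M, 200_000, np.random.default_rng(0))
print(np.cov(g))                        # empirical covariance approaches M
```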

  41. Concluding thoughts • Fascinating questions • Algorithms require geometric insights • Is the geometry intrinsic to these problems? • Many mysterious connections and unsolved problems
