
volume distortion for subsets of R^n


Presentation Transcript


  1. volume distortion for subsets of R^n. James R. Lee, Institute for Advanced Study & University of Washington. Symposium on Computational Geometry, 2006; Sedona, AZ.

  2. (n) (1) j=(u) k=(v) Given a graph G=(V,E), we seek a permutation  : V ! {1,2,...,n}. graph bandwidth The bandwidthof  is The bandwidthof G is bw(G) = min bw(). Goal: Efficiently compute an ordering  for which bw() ¼ bw(G).

  3. embeddings & projections. 1) Embed G into a Euclidean space R^k (preserving distances). 2) Project onto a random line and take the induced linear ordering.
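A schematic sketch of this two-step heuristic (illustrative, not the talk's exact algorithm), assuming the embedding has already been computed and is given as an n x k array whose i-th row is the image of vertex i:

```python
# Illustrative sketch of the embed-and-project heuristic (not the talk's exact
# algorithm). Assumes `points` is an (n x k) array whose i-th row is the
# Euclidean image of vertex i.
import numpy as np

def project_and_order(points, rng=None):
    rng = rng or np.random.default_rng()
    n, k = points.shape
    g = rng.standard_normal(k)
    direction = g / np.linalg.norm(g)          # uniformly random unit vector
    proj = points @ direction                  # project every point onto the line
    order = np.argsort(proj)                   # induced linear ordering
    return {int(v): pos + 1 for pos, v in enumerate(order)}   # vertex -> position
```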

  4. embeddings & projections. Analysis: count the number of points that fall into an interval, and use this to upper bound the stretch. Problem: we only have control over the expectation, but there could be costly correlations...

  5. embeddings & projections. Analysis: count the number of points that fall into an interval, and use this to upper bound the stretch. Problem: we only have control over the expectation, but there could be costly correlations... Feige gave an example where this approach takes a graph of bandwidth 2 and, with high probability, yields a solution of bandwidth ...

  6. embeddings & projections. Analysis: count the number of points that fall into an interval, and use this to upper bound the stretch. Feige gave an example where this approach takes a graph of bandwidth 2 and, with high probability, yields a solution of bandwidth ...
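To make the counting argument concrete: in the induced ordering, the stretch of an edge (u,v) is exactly the number of points whose projections land strictly between those of u and v, plus one (assuming no ties). A small sketch of that computation, under the same assumptions as the sketch above:

```python
# Sketch of the counting argument, same assumptions as above. With no ties in
# the projection, the stretch |pi(u) - pi(v)| of an edge equals the number of
# points that project strictly between u and v, plus one; so bounding how many
# points can fall into any edge's interval bounds the bandwidth of the ordering.
import numpy as np

def max_stretch_via_interval_counts(points, edges, direction):
    proj = points @ direction
    worst = 0
    for u, v in edges:
        lo, hi = sorted((proj[u], proj[v]))
        inside = int(np.sum((proj > lo) & (proj < hi)))   # points in the interval
        worst = max(worst, inside + 1)
    return worst
```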

  7. embeddings & projections. The volume of a set of points controls the probability that they project close together. Conditioned on the projections of the three red points, the projection of the blue point still varies in proportion to its distance from the affine hull of the red points (false as stated, but essentially true in high dimensions).
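A numerical illustration of this heuristic (my own, not from the talk), using a Gaussian direction g rather than a uniform unit vector; for Gaussian g the statement is exact once projections are measured relative to the red points: the conditional standard deviation of the blue point's relative projection equals its distance to the red points' affine hull.

```python
# Numerical illustration (mine, not from the talk) with a Gaussian direction g,
# for which the claim is exact once projections are measured relative to the
# red points: the conditional standard deviation of the blue point's relative
# projection equals its distance to the affine hull of the red points.
import numpy as np

rng = np.random.default_rng(0)
n = 50
red = rng.standard_normal((3, n))    # three "red" points in R^n
blue = rng.standard_normal(n)        # the "blue" point

# Exact distance from blue to the affine hull of the red points.
basis = (red[1:] - red[0]).T                                   # n x 2
coeffs, *_ = np.linalg.lstsq(basis, blue - red[0], rcond=None)
dist_to_hull = np.linalg.norm(blue - red[0] - basis @ coeffs)

# Empirical conditional standard deviation over many random directions.
G = rng.standard_normal((200_000, n))          # random Gaussian directions
rel_red = G @ basis                            # red projections relative to red[0]
rel_blue = G @ (blue - red[0])                 # blue projection relative to red[0]
fit, *_ = np.linalg.lstsq(rel_red, rel_blue, rcond=None)
residual_std = np.std(rel_blue - rel_red @ fit)

print(dist_to_hull, residual_std)              # the two values agree closely
```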

  8. volume distortion. This leads to a new notion of embedding [Feige 00], one which tries to maximize the volume of, e.g., all triangles in the image, subject to the map being non-expansive. We would like to get as close as possible to the best possible volume for every triple in our metric space (e.g., the shortest-path metric of our graph).

  9. higher-dimensional distortion. Given a metric space (X,d), a number k, and a non-expansive mapping f : X → R^m, we define the k-dimensional distortion of f as the smallest number D such that... (note: 1-dimensional distortion recovers the "standard" notion)
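For concreteness, the quantity being compared here is the (k-1)-dimensional volume of the simplex spanned by the images of a k-point subset; the exact normalization in the elided formula is not reproduced here. A small helper computing that volume via the Gram determinant:

```python
# Helper (illustrative): the (k-1)-dimensional volume of the simplex spanned by
# k points in R^m, via the Gram determinant. This is the kind of volume the
# k-dimensional distortion compares to the best achievable value.
import numpy as np
from math import factorial

def simplex_volume(points):
    """points: (k x m) array; returns vol_{k-1} of the simplex they span."""
    k = points.shape[0]
    A = points[1:] - points[0]                       # (k-1) x m edge vectors
    gram = A @ A.T
    return np.sqrt(max(np.linalg.det(gram), 0.0)) / factorial(k - 1)

# Example: a right triangle with legs of length 1 has area 1/2.
print(simplex_volume(np.array([[0.0, 0, 0], [1, 0, 0], [0, 1, 0]])))  # -> 0.5
```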

  10. previous results. General n-point metric spaces: [Feige 97], [Rao 99], [Krauthgamer-L-Mendel-Naor 04]. Subsets of Euclidean spaces (important for analyzing the bandwidth SDP): [Rao 99], [Krauthgamer-Linial-Magen 03], [Dunagan-Vempala 01] (k=3). New results:

  11. main result. Given an n-point subset X ⊆ R^n, there exists a non-expansive mapping f : X → R^n such that... This embedding maximizes the volume of every k-point subset to within a factor ≈ (log n)^{k/2}.

  12. construction of the embedding: three phases. 1) Randomized reduction to a collection of polylog(n) easier problems: random partitions, random sampling, gluing via smooth bump functions, measured descent, ... [KLMN, L, ALN]
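As a rough illustration of the gluing step (my own sketch, not the talk's construction), a piecewise-linear stand-in for the bump functions used to localize the contribution of each piece of a random partition:

```python
# Illustrative only (not the talk's construction): a piecewise-linear stand-in
# for a bump function used to glue locally defined maps. It is 1 on a cluster,
# 0 at distance >= R from it, and (1/R)-Lipschitz in between, so multiplying a
# local map by it confines that map's contribution to a neighborhood of its piece.
import numpy as np

def bump(dist_to_cluster, R):
    """Truncated, rescaled distance to a cluster of the random partition."""
    return np.clip(1.0 - np.asarray(dist_to_cluster) / R, 0.0, 1.0)
```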

  13. construction of the embedding. 2) Reduction to a continuous problem in the "right" dimension: dimension reduction (Johnson-Lindenstrauss) and Kirszbraun's extension theorem. Lipschitz extension problem: given S ⊆ X and a non-expansive map f : S → R^k, does there exist a non-expansive extension f : X → R^k? Answer: yes, if X is a subset of Euclidean space.
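A sketch of the Johnson-Lindenstrauss step (illustrative; the Gaussian construction and the constant 8 are common choices, not necessarily the ones used in the talk):

```python
# Sketch of Johnson-Lindenstrauss style dimension reduction (illustrative; the
# Gaussian construction and the constant 8 are common choices, not necessarily
# the talk's). With target dimension k = O(log(n)/eps^2), all pairwise distances
# among the n points are preserved up to a (1 +/- eps) factor with high probability.
import numpy as np

def jl_reduce(points, eps=0.5, rng=None):
    rng = rng or np.random.default_rng()
    n, d = points.shape
    k = max(1, int(np.ceil(8 * np.log(n) / eps**2)))
    M = rng.standard_normal((d, k)) / np.sqrt(k)    # random Gaussian projection matrix
    return points @ M
```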

  14. kirszbraun's theorem: every non-expansive map from a subset of a Hilbert space into another Hilbert space extends to a non-expansive map defined on the whole space.

  15. construction of the embedding. 3) Solution of the continuous problem. If you think of F_d(x) as being a real-valued random variable for every x, then we are saying that ... (standard deviation)

  16. construction of the embedding. If you think of F_d(x) as being a real-valued random variable for every x, then we are saying that ... [figure: points x, y and a set Q]

  17. open problems – remove the O(log log n) terms (here and in Sparsest Cut) / simplify the analysis – improve the approximation ratio for bandwidth: best known is ≈ O(log n)^3 [Feige, Dunagan-Vempala]; best known for trees is O(log n)^2.5 [Gupta]; conjectured optimal bound: O(log n).
