Lower Bounds for NNS and Metric Expansion

Presentation Transcript


  1. Lower Bounds for NNS and Metric Expansion • Rina Panigrahy, Kunal Talwar, Udi Wieder • Microsoft Research SVC

  2. Nearest Neighbor Search • Given n points in a metric space • Preprocess them into a small data structure • Given a query point q, quickly retrieve the point closest to q • Many applications
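
The linear scan below is the trivial baseline that any data structure must beat; a minimal sketch, with Hamming distance as an illustrative choice of metric:

```python
def nearest_neighbor(points, q, dist):
    """Brute force: scan all points and return the one closest to q."""
    return min(points, key=lambda p: dist(p, q))

# Illustrative metric: Hamming distance on bit strings stored as integers.
def hamming(x, y):
    return bin(x ^ y).count("1")

points = [0b0011, 0b1100, 0b0110]
print(nearest_neighbor(points, 0b0111, hamming))  # 3, i.e. 0b0011, at distance 1
```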

  3. Decision Version • Given search radius r, find a point within distance r of the query point • Relation to approximate NNS: if the second-nearest neighbor is at distance cr, then the answer is also a c-approximate nearest neighbor
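
A minimal sketch of the decision version and of the gap argument above; the function names and metric argument are ours:

```python
def near_neighbor(points, q, r, dist):
    """Decision version: return some point within distance r of q, or None."""
    for p in points:
        if dist(p, q) <= r:
            return p
    return None

# Relation to approximate NNS: if the second-nearest neighbor of q is at
# distance >= c*r > r, then the only point within distance r of q is the
# nearest neighbor itself, so any answer to the decision problem is
# automatically a c-approximate (here even exact) nearest neighbor.
```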

  4. Cell Probe Model • Preprocess into a data structure with m words, w bits per word • The query algorithm is charged t if it probes t words of the data structure • All computation is free • Study the tradeoff between m, w and t
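
A toy rendering of the model, only to fix the accounting; the class and its names are our invention, not an interface from the paper:

```python
class CellProbeTable:
    """Toy cell-probe model: m cells of w bits each; only reads are charged."""
    def __init__(self, m, w):
        self.m, self.w = m, w
        self.cells = [0] * m   # the data structure: m words of w bits
        self.probes = 0        # t = number of charged reads per query

    def write(self, i, word):  # preprocessing writes are free
        assert 0 <= word < 2 ** self.w
        self.cells[i] = word

    def read(self, i):         # each probe costs 1; all computation is free
        self.probes += 1
        return self.cells[i]
```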

  5. Many different lower bounds • (overview table of prior cell-probe bounds; surviving entry: n·exp(ε³d))

  6. Lower Bounds from Expansion • A unified approach for proving cell probe lower bounds for near neighbor and similar problems • All these lower bounds stem from the same combinatorial property of the metric space • Expansion: |points near A| / |A| • Plus some new lower bounds
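
The expansion quantity from this slide, as a short sketch (adjacency given as a dict of neighbor sets; here N(A) is the set of all vertices adjacent to A):

```python
def vertex_expansion(adj, A):
    """Expansion of a vertex set A: |N(A)| / |A|, where N(A) is the set of
    all vertices adjacent to some vertex of A (the points near A)."""
    A = set(A)
    N = set()
    for u in A:
        N |= adj[u]            # adj[u] is the set of neighbors of u
    return len(N) / len(A)

# 4-cycle 0-1-2-3: N({0}) = {1, 3}, so the expansion of {0} is 2.
adj = {0: {1, 3}, 1: {0, 2}, 2: {1, 3}, 3: {0, 2}}
print(vertex_expansion(adj, {0}))  # 2.0
```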

  7. Graphical Nearest Neighbor • Convert the metric space to a graph • Place an edge if two nodes are within distance r • Return any neighbor of the query • Now r = 1
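
A minimal sketch of the conversion, assuming points, a radius r and a metric dist as inputs:

```python
import itertools

def near_neighbor_graph(points, r, dist):
    """Adjacency of the GNS graph: an edge iff two points are within r."""
    adj = {i: set() for i in range(len(points))}
    for i, j in itertools.combinations(range(len(points)), 2):
        if dist(points[i], points[j]) <= r:
            adj[i].add(j)
            adj[j].add(i)
    return adj
```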

  8. Graphical Nearest Neighbor • Assume uniform degree • Use a random data set • Assume that w.h.p. the n balls around the data points are disjoint

  9. Deterministic Bounds via Expansion

  10. Deterministic Bound

  11. Example Application • n·exp(ε²d)

  12. Proof Idea when t = 1: Shattering • F : V → [m] partitions V into m regions • Split large regions • A random ball is shattered into many parts: about Φ(G) • So Φ(G)-fold replication in space
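
A toy illustration of shattering, under assumptions of ours: a uniformly random partition of the d-cube rather than one induced by a real table, and the radius-1 ball around a random point:

```python
import random

def parts_hit(ball, F):
    """Number of distinct parts of the partition F: V -> [m] the ball meets."""
    return len({F[v] for v in ball})

d, m = 10, 64
F = {v: random.randrange(m) for v in range(2 ** d)}  # a random partition
x = random.randrange(2 ** d)
ball = [x] + [x ^ (1 << i) for i in range(d)]        # x plus its d neighbors
print(parts_hit(ball, F))  # typically close to d + 1 = 11
```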

  13. Proof Idea when t = 1 • The query point determines which cell of the table is read • Select a fraction of the cells such that the corresponding part of the space likely contains a quarter of the data set points • So … and …

  14. Generalizing for larger t • Select a fraction of each table such that … • Continue as before • This handles non-adaptive algorithms • Adaptive algorithms depend upon the content of the selected cells • Subexponential number of algorithms • Union bound

  15. Randomized Bounds • So far we assumed the algorithm is correct on every query point • What if only a fraction of the query points are good? • Need to relax the definition of vertex expansion

  16. Randomized Bounds: Robust Expansion • N(A) captures all edges from A • Expansion = |N(A)| / |A| • Robust version: capture only ¾ of the edges from A

  17. Robust Expansion • Small-set vertex expansion • In other words: we can cover all the edges incident on A with a set of size Φ·|A| • We can cover a γ fraction of the edges incident on A with a set of size Φr(γ)·|A| • Robust expansion is at least the edge expansion
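
A hedged sketch of this quantity for one fixed set A, with names ours: group the ordered edges leaving A by their far endpoint and greedily pick endpoints until a γ fraction is covered (optimal here, since each edge is covered exactly by its far endpoint); the graph parameter Φr(γ) would additionally minimize this ratio over sets A of the relevant size:

```python
def robust_expansion_of_set(adj, A, gamma):
    """Smallest |B| / |A| such that B covers a gamma fraction of the ordered
    edges (u, v) with u in A, each edge being covered by its endpoint v."""
    A = set(A)
    hits, total = {}, 0
    for u in A:
        for v in adj[u]:
            hits[v] = hits.get(v, 0) + 1   # edges from A landing on v
            total += 1
    covered, B = 0, set()
    for v in sorted(hits, key=hits.get, reverse=True):
        if covered >= gamma * total:       # gamma fraction already covered
            break
        B.add(v)
        covered += hits[v]
    return len(B) / len(A)
```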

  18. Bound for Randomized Data Structures • Theorem: if the data distribution is weakly independent, then a randomized data structure that answers GNS queries with space m (w-bit words) and t queries must satisfy … and …

  19. Proof Idea when t = 1: Shattering • Most of a random ball is shattered into many parts: about Φr • So Φr-fold replication in space

  20. Generalizing for larger t • Sample a 1/Φr^(1/t) fraction from each table • For a random ball, the good part survives in all tables • The union bound for adaptive algorithms is trickier

  21. Applications • We know how to calculate the robust expansion of graphs derived from: • … (known) • … (new) • … (natural input distribution) • We don't know the robust expansion of: • …

  22. General Upper Bound • Say G is a Cayley graph • Take … • Take a set B with robust expansion … • Use random translations of B to define the access function • For random input, the success probability is constant
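
A toy instantiation of this recipe, with every concrete choice ours rather than the paper's: the group is {0,1}^d under XOR, B is a Hamming ball, the translates are uniform shifts, and the parameters are arbitrary rather than tuned to the robust expansion:

```python
import random

d, rad, T = 16, 4, 32      # cube dimension, radius of B, number of translates
shifts = [random.getrandbits(d) for _ in range(T)]

def in_translate(x, i):
    """x lies in the translate B xor shifts[i] of the radius-rad ball B."""
    return bin(x ^ shifts[i]).count("1") <= rad

def build(points):
    """One table per translate, holding the data points that fall in it."""
    return [[p for p in points if in_translate(p, i)] for i in range(T)]

def query(tables, x, r=1):
    """Access function: probe only the tables whose translate contains x."""
    for i in range(T):
        if in_translate(x, i):
            for p in tables[i]:
                if bin(p ^ x).count("1") <= r:
                    return p
    return None
```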

  23. Conclusions and Open Problems • A unified approach to NNS cell probe lower bounds • Often characterized by expansion • Average case with natural distributions • Higher lower bounds? • Improve the dependency on … (very hard) • Dynamic NNS: a tight bound for special cases is shown in the paper

  24. Approximate Near Neighbor Search


  26. Graphical Nearest Neighbor

  27. Randomized Bounds • So far we assumed the algorithm is correct on every query point • What if only a fraction of the query points are good? • Need to relax the definition of vertex expansion, and add an independence condition • The input distribution is weakly independent if for a random … it holds that …

  28. Deterministic Bounds via Expansion

  29. Proof Idea • Can we plug the new definitions into the old proof? • Conceptually: yes! • Actually… well, no • Dependencies everywhere: the set of good neighbors of a data point depends upon the rest of the data set • Solving this is the technical crux of the paper
