
Lecture 9: Fast Dimension Reduction. Sketching




Presentation Transcript


1. Lecture 9: Fast Dimension Reduction. Sketching

2. Plan • PS2 due tomorrow, 7pm • My office hours after class • Fast Dimension Reduction • Sketching • Scribe volunteer? • Notes due Friday evening

3. Johnson-Lindenstrauss Lemma • Let G be a k × d matrix of i.i.d. Gaussian entries, scaled by 1/√k • Then ||Gx|| = (1 ± ε) · ||x|| with probability ≥ 1 − δ • for k = O(1/ε² · log(1/δ)) • Time to compute Gx: O(kd) = O(d/ε² · log(1/δ)) • Faster? • Ideally, close to O(d + k) time? • Will show: roughly O(d log d) time
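A minimal numerical sketch (not part of the original slides) of the dense Gaussian JL map from this slide: G has i.i.d. N(0, 1/k) entries, applying it costs O(kd) time, and ||Gx|| concentrates around ||x||. The dimensions and the random seed are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)
d, k = 4096, 2000          # k ~ (1/eps^2) log(1/delta) in the lemma; values here are illustrative
x = rng.standard_normal(d)

# Dense Gaussian JL map: k x d matrix with i.i.d. N(0, 1/k) entries.
G = rng.standard_normal((k, d)) / np.sqrt(k)
y = G @ x                  # O(kd) time -- this is the cost we want to beat

print(np.linalg.norm(y) / np.linalg.norm(x))   # close to 1, i.e. ||Gx|| = (1 +/- eps) ||x||
```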

4. Fast JL Transform • Costly because G is dense • Meta-approach: use a sparse matrix instead? • Suppose we sample s entries per row (random ±1 signs, times a normalization constant) • Analysis of one row r: • want (r·x)² to estimate ||x||², so that the average over the k rows is (1 ± ε) · ||x||² with probability 1 − δ • Expectation of (r·x)²: equals ||x||² once the normalization constant is set to √(d/s) • What about the variance?
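To make the one-row analysis concrete, here is a small Monte Carlo check (my own illustration; the helper name row_estimate and all parameters are not from the lecture): a row samples s coordinates with random ±1 signs and scaling √(d/s), and the squared inner product with x then has expectation ||x||².

```python
import numpy as np

rng = np.random.default_rng(1)
d, s, trials = 1024, 8, 50_000
x = rng.standard_normal(d)

def row_estimate(x, s, rng):
    """(r . x)^2 for one sparse row r: s random coordinates, random +/-1 signs,
    scaled by sqrt(d/s) so that the expectation is ||x||^2."""
    d = len(x)
    coords = rng.choice(d, size=s, replace=False)
    signs = rng.choice([-1.0, 1.0], size=s)
    return (d / s) * (signs @ x[coords]) ** 2

estimates = np.array([row_estimate(x, s, rng) for _ in range(trials)])
print(estimates.mean(), x @ x)   # the two numbers agree up to sampling noise
```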

5. Fast JLT: sparse projection • Variance of (r·x)² can be large, unfortunately • Bad case: x is sparse • think: x has just one or two nonzero coordinates • Even for s·k = d (each coordinate of x goes somewhere) • two nonzero coordinates collide, i.e. land in the same row (bad), with probability ~ 1/k • we want failure probability exponentially small in k • so we would really need k ~ 1/δ rows, far too many • But, take-away: a sparse matrix may work if x is “spread around” • New plan: • first “spread around” x • then use a sparse projection P
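The bad case can also be seen numerically. The sketch below (illustrative parameters, same hypothetical row_estimate helper as above) compares the one-row estimator on a spread-out unit vector and on a 1-sparse unit vector: the mean is 1 in both cases, but the variance blows up to roughly d/s for the sparse input.

```python
import numpy as np

rng = np.random.default_rng(2)
d, s, trials = 1024, 8, 50_000

def row_estimate(x, s, rng):
    # Same sparse-row estimator as before: s sampled coordinates, random signs, sqrt(d/s) scaling.
    d = len(x)
    coords = rng.choice(d, size=s, replace=False)
    signs = rng.choice([-1.0, 1.0], size=s)
    return (d / s) * (signs @ x[coords]) ** 2

x_spread = np.ones(d) / np.sqrt(d)    # "spread around" unit vector
x_sparse = np.zeros(d)
x_sparse[0] = 1.0                     # all mass on one coordinate

for name, x in [("spread", x_spread), ("sparse", x_sparse)]:
    est = np.array([row_estimate(x, s, rng) for _ in range(trials)])
    print(f"{name}: mean={est.mean():.3f}  var={est.var():.1f}")
# spread: variance is O(1); sparse: variance is about d/s - 1, i.e. huge
```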

6. FJLT: construction • The map is P·H·D, where: • D = diagonal matrix with random ±1 on the diagonal • H = Hadamard matrix (a Fourier-type transform) • a non-trivial rotation • H·D·x can be computed in O(d log d) time • P = projection matrix: a sparse matrix as before, of size k × d • [Diagram: x → D (diagonal) → H (Hadamard / Fast Fourier Transform) → P (projection: sparse matrix); D and H together do the “spreading around”]
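A compact sketch of the P·H·D pipeline (my own illustrative code, assuming d is a power of 2; the helpers fwht and fjlt are not the lecture's notation): D flips signs, H is applied via the fast Walsh-Hadamard transform in O(d log d) time, and P samples k coordinates with a √(d/k) rescaling.

```python
import numpy as np

def fwht(x):
    """Orthonormal fast Walsh-Hadamard transform; len(x) must be a power of 2. O(d log d) time."""
    y = x.astype(float).copy()
    d, h = len(y), 1
    while h < d:
        for i in range(0, d, 2 * h):
            a, b = y[i:i + h].copy(), y[i + h:i + 2 * h].copy()
            y[i:i + h], y[i + h:i + 2 * h] = a + b, a - b
        h *= 2
    return y / np.sqrt(d)          # 1/sqrt(d) makes the transform a rotation (orthonormal)

def fjlt(x, k, rng):
    """Apply P * H * D to x: random signs, Hadamard transform, projection onto k random coordinates."""
    d = len(x)
    signs = rng.choice([-1.0, 1.0], size=d)       # D: random +/-1 diagonal
    y = fwht(signs * x)                           # H D x, computed in O(d log d)
    coords = rng.choice(d, size=k, replace=False)
    return np.sqrt(d / k) * y[coords]             # P: sample k coordinates, rescale

rng = np.random.default_rng(3)
d, k = 1024, 256
x = rng.standard_normal(d)
print(np.linalg.norm(fjlt(x, k, rng)) / np.linalg.norm(x))   # close to 1
```

Sampling the k coordinates without replacement and rescaling by √(d/k) keeps the norm estimate unbiased; the slides describe P only as a sparse projection, so this is one concrete instantiation, not necessarily the lecture's exact choice.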

7. Spreading around: intuition • Idea for the Hadamard/Fourier transform: • “Uncertainty principle”: if the original vector x is sparse, then its transform is dense! • Though H alone can “break” x’s that are already dense (map them to sparse vectors); this is what the random diagonal D guards against
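A one-off illustration (not from the slides) of the uncertainty-principle intuition: a 1-sparse input is mapped by H·D to a vector whose coordinates all have the same magnitude 1/√d.

```python
import numpy as np
from scipy.linalg import hadamard

rng = np.random.default_rng(4)
d = 64
H = hadamard(d) / np.sqrt(d)                  # orthonormal Hadamard matrix
D = np.diag(rng.choice([-1.0, 1.0], size=d))  # random sign diagonal

x_sparse = np.zeros(d)
x_sparse[0] = 1.0                             # extremely sparse input
y = H @ D @ x_sparse
print(np.abs(y).min(), np.abs(y).max())       # every coordinate has magnitude exactly 1/sqrt(d)
```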

8. Spreading around: proof • Suppose ||x|| = 1 • without loss of generality, since the map is linear! • Ideal spreading around: • would like |(HDx)_i| ≈ 1/√d for all i • Lemma: with probability at least 1 − δ, for each coordinate i: |(HDx)_i| ≤ O(√(log(d/δ)/d)) • Proof: • (HDx)_i = ⟨g, x⟩ · 1/√d, where g is a random ±1 vector (row i of H times the random signs of D) • as mentioned before, ⟨g, x⟩ “behaves like” ⟨γ, x⟩ for a Gaussian vector γ (needs proof: at the end of the lecture, time permitting) • Hence |(HDx)_i| ≤ O(√(log(d/δ)/d)) with probability 1 − δ/d for each i; a union bound over the d coordinates finishes the proof
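An empirical check of the lemma (my own illustrative code and parameters): over many random sign diagonals D, the largest coordinate of H·D·x stays on the order of √(log(d)/d) · ||x||, far below ||x|| itself.

```python
import numpy as np
from scipy.linalg import hadamard

rng = np.random.default_rng(5)
d, trials = 256, 2000
H = hadamard(d) / np.sqrt(d)
x = rng.standard_normal(d)
x /= np.linalg.norm(x)                 # WLOG ||x|| = 1 (the map is linear)

max_coords = []
for _ in range(trials):
    signs = rng.choice([-1.0, 1.0], size=d)
    y = H @ (signs * x)                # H D x for a fresh random D
    max_coords.append(np.abs(y).max())

# Both values are small: the max coordinate is O(sqrt(log(d)/d)), far below ||x|| = 1.
print(max(max_coords), np.sqrt(np.log(d) / d))
```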

9. Why projection? • Why aren’t we done? • Why not just keep the first k coordinates of y = HDx? • Each coordinate has the same distribution: • roughly Gaussian, with variance ||x||²/d • Issue: • the coordinates of y are not independent! • Nevertheless: • ||y|| = ||x||, since HD is a change of basis (a rotation of R^d)
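A quick numerical illustration of both points on this slide (my own example): across random choices of D, an individual coordinate of y = HDx fluctuates like a Gaussian with variance ||x||²/d, yet ||y||² equals ||x||² exactly in every trial, so the coordinates cannot be independent.

```python
import numpy as np
from scipy.linalg import hadamard

rng = np.random.default_rng(6)
d, trials = 128, 5000
H = hadamard(d) / np.sqrt(d)
x = rng.standard_normal(d)

first_coord, sq_norms = [], []
for _ in range(trials):
    y = H @ (rng.choice([-1.0, 1.0], size=d) * x)   # y = H D x for a fresh random D
    first_coord.append(y[0])
    sq_norms.append(y @ y)

print(np.var(first_coord), (x @ x) / d)   # coordinate variance ~ ||x||^2 / d (roughly Gaussian)
print(np.var(sq_norms))                   # ~0: ||y||^2 is always exactly ||x||^2 (HD is a rotation)
```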

10. Projection • So far: • ||y|| = ||x||, and |y_i| ≤ O(√(log(d/δ)/d)) · ||x|| for all i, with probability ≥ 1 − δ • Or: y_i² ≤ O(log(d/δ)/d) · ||y||² with probability ≥ 1 − δ • P = projection onto just k random coordinates (rescaled by √(d/k))! • Proof: standard concentration • Chernoff: since each term y_i² is bounded by O(log(d/δ)/d) · ||y||², it is enough to sample O(1/ε² · log(1/δ) · log(d/δ)) terms for a 1 ± ε approximation of ||y||² • Hence k = O(1/ε² · log(1/δ) · log(d/δ)) suffices
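A sketch of the projection step in isolation (illustrative parameters, my own code): given a well-spread unit vector (standing in for HDx), sampling k coordinates with a √(d/k) rescaling estimates the norm to within a few percent once k is in the hundreds, qualitatively matching the k = O(1/ε² · log(1/δ) · log(d/δ)) bound.

```python
import numpy as np

rng = np.random.default_rng(7)
d = 4096
y = rng.standard_normal(d)
y /= np.linalg.norm(y)                  # a "spread around" unit vector, like HDx

for k in (64, 256, 1024):
    errs = []
    for _ in range(2000):
        coords = rng.choice(d, size=k, replace=False)
        est = np.sqrt(d / k) * np.linalg.norm(y[coords])   # ||P y|| for a random projection P
        errs.append(abs(est - 1.0))
    print(k, max(errs))                 # worst error over the trials shrinks roughly like 1/sqrt(k)
```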

11. FJLT: wrap-up • Obtain: • ||PHDx|| = (1 ± ε) · ||x|| with probability ≥ 1 − δ • dimension of PHDx is k′ = O(1/ε² · log(1/δ) · log(d/δ)) • time: O(d log d + k′), i.e. O(d log d) for the interesting range of parameters • Dimension not optimal: • apply regular (dense) JL on PHDx • to reduce further to O(1/ε² · log(1/δ)) dimensions • Final time: O(d log d + 1/ε⁴ · log²(1/δ) · log(d/δ)), where the second term is the cost of the dense JL step
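Putting the pieces together, a sketch of the full pipeline under the assumptions above (my own code; the fwht helper is repeated for self-containment, and k_mid, k_final are illustrative): the FJLT reduces to an intermediate dimension in O(d log d) time, then a small dense Gaussian JL matrix reduces it further.

```python
import numpy as np

def fwht(x):
    """Orthonormal fast Walsh-Hadamard transform (len(x) a power of 2), O(d log d) time."""
    y = x.astype(float).copy()
    d, h = len(y), 1
    while h < d:
        for i in range(0, d, 2 * h):
            a, b = y[i:i + h].copy(), y[i + h:i + 2 * h].copy()
            y[i:i + h], y[i + h:i + 2 * h] = a + b, a - b
        h *= 2
    return y / np.sqrt(d)

def fast_jl(x, k_mid, k_final, rng):
    """FJLT (P H D) down to k_mid dimensions, then a dense Gaussian JL map down to k_final."""
    d = len(x)
    signs = rng.choice([-1.0, 1.0], size=d)
    y = fwht(signs * x)                                   # H D x in O(d log d)
    coords = rng.choice(d, size=k_mid, replace=False)
    z = np.sqrt(d / k_mid) * y[coords]                    # P: k_mid sampled coordinates
    G = rng.standard_normal((k_final, k_mid)) / np.sqrt(k_final)
    return G @ z                                          # dense JL on the short vector: O(k_final * k_mid)

rng = np.random.default_rng(8)
d, k_mid, k_final = 4096, 1024, 256
x = rng.standard_normal(d)
print(np.linalg.norm(fast_jl(x, k_mid, k_final, rng)) / np.linalg.norm(x))  # close to 1
```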
