
EE 290A: Generalized Principal Component Analysis


Presentation Transcript


  1. EE 290A: Generalized Principal Component Analysis. Lecture 5: Generalized Principal Component Analysis

  2. Last time • GPCA: Problem definition • Segmentation of multiple hyperplanes

  3. Recover subspaces from vanishing polynomial
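For the hyperplane case from last lecture, the idea can be written out in one line (notation carried over: b_j is the normal of the j-th hyperplane S_j):

```latex
% The degree-K vanishing polynomial factors into linear forms, and its
% gradient at a sample x_i lying on S_i alone recovers b_i up to scale.
p_K(x) = \prod_{j=1}^{K} (b_j^\top x), \qquad
\nabla p_K(x_i) = \sum_{j=1}^{K} b_j \prod_{l \neq j} (b_l^\top x_i)
\;\propto\; b_i
\quad \text{for } x_i \in S_i \setminus \textstyle\bigcup_{j \neq i} S_j .
```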

  4. (figure)

  5. (figure)

  6. This Lecture • Segmentation of general subspace arrangements knowing the number of subspaces • Subspace segmentation without knowing the number of subspaces

  7. An Introductory Example

  8. Make use of the vanishing polynomials

  9. Recover Mixture Subspace Models
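To make the introductory example concrete, here is a minimal sketch in Python. The setup, two lines in R^2 (the coordinate axes) with vanishing polynomial p(x) = x1*x2, is an illustrative assumption, not taken from the slides:

```python
import numpy as np

# Two lines in R^2 (the coordinate axes); their union is the zero set of
# p(x) = x1 * x2, and the gradient of p at a sample recovers that line's normal.
p_grad = lambda x: np.array([x[1], x[0]])   # grad of p(x) = x1*x2

x_on_line1 = np.array([3.0, 0.0])           # a sample on the x1-axis
x_on_line2 = np.array([0.0, -2.0])          # a sample on the x2-axis

n1 = p_grad(x_on_line1)                     # [0., 3.]  ~ normal of the x1-axis
n2 = p_grad(x_on_line2)                     # [-2., 0.] ~ normal of the x2-axis
print(n1 / np.linalg.norm(n1), n2 / np.linalg.norm(n2))
```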

  10. Question: How to choose one representative point per subspace? (some loose answers) • In the noise-free case, randomly pick one. • In the noisy case, choose one close to the zero set of the vanishing polynomials. (How? See Step 3 below.)

  11. Summary • Using the vanishing polynomials, GPCA converts a chicken-and-egg (CAE) problem into one with a closed-form solution.

  12. Step 1: Fitting Polynomials • In general, when the dimensions of subspaces are mixed, the set of all K-th degree polynomials that vanish on A becomes more complicated.

  13. Polynomials may be dependent!

  14. With the closed-form solution, even when the sample data are noisy, if K and the subspace dimensions are known, a complete list of linearly independent vanishing polynomials can be recovered from the null space of the embedded data matrix!
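A minimal sketch of this fitting step, assuming samples stacked as rows of X and a known degree K; the function names and the rank tolerance are illustrative, not from the lecture:

```python
import itertools
import numpy as np

def veronese_exponents(D, K):
    """All exponent vectors alpha with |alpha| = K, in a fixed order."""
    exps = []
    for combo in itertools.combinations_with_replacement(range(D), K):
        alpha = [0] * D
        for i in combo:
            alpha[i] += 1
        exps.append(tuple(alpha))
    return exps

def veronese_map(X, K):
    """Degree-K Veronese embedding: each row x -> all monomials x^alpha, |alpha| = K."""
    N, D = X.shape
    exps = veronese_exponents(D, K)
    V = np.ones((N, len(exps)))
    for j, alpha in enumerate(exps):
        for i, a in enumerate(alpha):
            V[:, j] *= X[:, i] ** a
    return V, exps

def fit_vanishing_polynomials(X, K, rank_tol=1e-6):
    """Coefficient vectors of linearly independent degree-K vanishing polynomials,
    recovered from the (approximate) null space of the embedded data matrix."""
    V, exps = veronese_map(X, K)
    _, s, Vt = np.linalg.svd(V, full_matrices=True)
    # Right singular vectors with (numerically) zero singular values span the
    # space of vanishing polynomials; pad s so extra vectors count as null.
    s_full = np.concatenate([s, np.zeros(Vt.shape[0] - len(s))])
    return Vt[s_full < rank_tol * s[0]], exps
```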

  15. Step 2: Polynomial Differentiation
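Continuing the sketch: evaluating the gradients of all fitted polynomials at one well-chosen sample per subspace spans that subspace's normal space, and the orthogonal complement gives the subspace basis. Names are assumptions that reuse the Step 1 sketch:

```python
def poly_gradient(coeffs, exps, x):
    """Gradient at x of the polynomial sum_alpha c_alpha * x^alpha."""
    D = len(x)
    g = np.zeros(D)
    for c, alpha in zip(coeffs, exps):
        for i in range(D):
            if alpha[i] == 0:
                continue
            term = c * alpha[i]                 # d/dx_i of x^alpha
            for k in range(D):
                term *= x[k] ** (alpha[k] - (1 if k == i else 0))
            g[i] += term
    return g

def subspace_basis_at(C, exps, x, dim_tol=1e-6):
    """Orthonormal basis of the subspace through sample x, as the orthogonal
    complement of the span of the gradients of all vanishing polynomials."""
    G = np.vstack([poly_gradient(c, exps, x) for c in C])  # normals as rows
    _, s, Vt = np.linalg.svd(G)
    rank = int(np.sum(s > dim_tol * s[0]))     # dimension of the normal space
    return Vt[rank:].T                         # complement = subspace basis
```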

  16. Step 3: Sample Point Selection • Given n sample points from K subspaces, how to choose one point per subspace at which to evaluate the orthonormal basis of that subspace? • What is the notion of optimality in choosing the best sample when a set of vanishing polynomials is given (for any algebraic set)?

  17. In the case of segmenting hyperplanes?

  18. Draw a random line that does not pass through the origin (see the sketch below)
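One way to implement the random-line trick for hyperplanes: restricted to a line x(t) = x0 + t v, the vanishing polynomial becomes univariate of degree K, and its K roots give one point on each hyperplane. Here `p` is any callable evaluating the fitted polynomial; the interpolation-based root finding is an assumption of this sketch:

```python
import numpy as np

def one_point_per_hyperplane(p, K, D, rng=np.random.default_rng(0)):
    """Intersect a random line with the zero set of p to get K points,
    one on each hyperplane."""
    x0, v = rng.normal(size=D), rng.normal(size=D)   # random line (misses 0 a.s.)
    ts = np.linspace(-1.0, 1.0, K + 1)
    # K+1 samples determine the degree-K restriction of p exactly.
    coeffs = np.polyfit(ts, [p(x0 + t * v) for t in ts], K)
    roots = np.roots(coeffs)
    roots = roots[np.abs(roots.imag) < 1e-8].real    # keep real intersections
    return [x0 + t * v for t in roots]
```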

  19. Lemma 3.9: For general arrangements • We shall choose samples as close to the zero set as possible (in the presence of noise). • One shall avoid choosing points based on P(x) alone, as it measures merely an algebraic error, not the geometric distance. • One shall discourage choosing points close to the intersection of two or more subspaces, even when P(x) = 0. (A sketch implementing these rules follows.)
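A hedged implementation of this selection rule: rank candidates by the first-order geometric distance |P(x)|/||DP(x)|| rather than the algebraic error |P(x)|, with a small additive constant that pushes the score of near-intersection points (where the gradients also vanish) toward 1. The constant `delta` and the callables are assumptions of the sketch:

```python
import numpy as np

def pick_representative(X, polys, grads, delta=1e-6):
    """Choose the sample best suited for the differentiation step."""
    best, best_score = None, np.inf
    for x in X:
        alg = np.linalg.norm([p(x) for p in polys])            # algebraic error
        grd = np.linalg.norm(np.vstack([g(x) for g in grads])) # gradient energy
        # First-order geometric distance |P|/||DP||; the additive delta makes
        # points near intersections (where the gradients also vanish) score
        # close to 1 rather than 0/0, so they are avoided.
        score = (alg + delta) / (grd + delta)
        if score < best_score:
            best, best_score = x, score
    return best
```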

  20. (figure)

  21. Estimate the Remaining (K-1) Subspaces • Polynomial division
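Polynomial division can be carried out numerically in coefficient space: multiplying by the linear form b^T x is a linear map from degree-(K-1) coefficients to degree-K coefficients, so dividing p by b^T x is a least-squares solve. A sketch reusing veronese_exponents from the Step 1 sketch (names assumed):

```python
import numpy as np

def divide_by_linear_form(c_p, exps_K, b, D, K):
    """Coefficients of q with p ~= (b^T x) * q, solved in least squares.
    exps_K is the monomial order used when fitting c_p."""
    exps_Km1 = veronese_exponents(D, K - 1)
    col = {alpha: j for j, alpha in enumerate(exps_K)}
    M = np.zeros((len(exps_K), len(exps_Km1)))
    for j, alpha in enumerate(exps_Km1):
        for i in range(D):                    # (b^T x) * x^alpha term by term
            beta = list(alpha)
            beta[i] += 1
            M[col[tuple(beta)], j] += b[i]
    c_q, *_ = np.linalg.lstsq(M, c_p, rcond=None)
    return c_q, exps_Km1
```

In the hyperplane case, repeating this division peels off the remaining subspaces one normal at a time.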

  22. (figure)

  23. GPCA without knowing K or d’s • Determining K and the d’s is straightforward when the subspaces are of equal dimension • If d is known, project the samples to a (d+1)-dim space; the problem becomes hyperplane segmentation. • If K is known, project the samples to l-dim spaces for l = 1, 2, …, and compute the K-th order Veronese map until the embedded data matrix drops rank (sketched below). • If both K and d are unknown, try all the combinations.
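For the hyperplane case, the rank test can be sketched as follows: embed the data at degree k = 1, 2, … and stop at the first k for which the embedded data matrix drops rank, i.e. the first degree admitting a vanishing polynomial. Names and tolerance are assumptions; veronese_map is reused from the Step 1 sketch:

```python
import numpy as np

def count_hyperplanes(X, k_max=10, rank_tol=1e-6):
    """Smallest k at which the degree-k embedded data matrix drops rank."""
    for k in range(1, k_max + 1):
        V, _ = veronese_map(X, k)
        if V.shape[0] < V.shape[1]:
            # The test needs at least as many samples as degree-k monomials.
            raise ValueError(f"need >= {V.shape[1]} samples to test degree {k}")
        s = np.linalg.svd(V, compute_uv=False)
        if s[-1] < rank_tol * s[0]:        # first vanishing polynomial appears
            return k
    raise ValueError("no rank drop up to k_max; data may not fit the model")
```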

  24. GPCA without knowing K or d’s • Determine arrangements of different dimensions: 1. If the data are noise-free, check the Hilbert function table.

  25. 2. When the data are noisy, apply GPCA recursively. Please read Section 3.5 for the definition of Effective Dimension (a sketch of one common form follows).
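For reference, a sketch of the effective-dimension criterion used for this model selection; the authoritative definition is in Section 3.5, so treat the formula below as an assumption of this sketch:

```python
def effective_dimension(D, dims, counts):
    """Model-selection score for an arrangement of subspaces in R^D:
    dims[j] = d_j (subspace dimensions), counts[j] = N_j (points assigned)."""
    N = sum(counts)
    model_cost = sum(d * (D - d) for d in dims)            # basis parameters
    data_cost = sum(d * n for d, n in zip(dims, counts))   # per-point coordinates
    return (model_cost + data_cost) / N                    # dimensions per sample
```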
