
Manifold learning and pattern matching with entropic graphs

Presentation Transcript


  1. Manifold learning and pattern matching with entropic graphs Alfred O. Hero Dept. EECS, Dept. Biomed. Eng., Dept. Statistics University of Michigan - Ann Arbor hero@eecs.umich.edu http://www.eecs.umich.edu/~hero

  2. Multimodality Face Matching

  3. Clustering Gene Microarray Data Cy5/Cy3 hybridization profiles

  4. Image Registration

  5. Vehicle Classification • 128x128 images of three vehicles, captured at 1 deg increments over 360 deg of azimuth at 0 deg elevation • The 3 x 360 = 1080 images lie on a lower-dimensional manifold embedded in R^16384 HMMWV Truck T62 Courtesy of Center for Imaging Science, JHU

  6. Image Manifold

  7. What is manifold learning good for? • Interpreting high dimensional data • Discovery and exploitation of lower dimensional structure • Deducing non-linear dependencies between populations • Improving detection and classification performance • Improving image compression performance

  8. Random Sampling on a Manifold

  9. Classifying on a Manifold Class A Class B

  10. Background on Manifold Learning • Manifold intrinsic dimension estimation • Local KLE, Fukunaga, Olsen (1971) • Nearest neighbor algorithm, Pettis, Bailey, Jain, Dubes (1979) • Fractal measures, Camastra and Vinciarelli (2002) • Packing numbers, Kegl (2002) • Manifold Reconstruction • Isomap-MDS, Tenenbaum, de Silva, Langford (2000) • Locally Linear Embeddings (LLE), Roweis, Saul (2000) • Laplacian eigenmaps (LE), Belkin, Niyogi (2002) • Hessian eigenmaps (HE), Grimes, Donoho (2003) • Characterization of sampling distributions on manifolds • Statistics of directional data, Watson (1956), Mardia (1972) • Statistics of shape, Kendall (1984), Kent, Mardia (2001) • Data compression on 3D surfaces, Kolarov, Lynch (1997)

  11. Sampling on a Domain Manifold • Assumption: the embedding is a conformal mapping • [Figure: a 2D domain manifold, its embedding, the sampling distribution on the domain, and a statistical sample drawn from it]
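
In symbols, the sampling model this slide depicts can be written as follows (a minimal sketch; the symbols Omega, phi, f, and D are assumed here, not taken from the original figure):

```latex
% Sampling model (notation assumed): a conformal embedding of a
% 2D domain manifold, sampled i.i.d. on the domain.
\[
  X_1,\dots,X_n \stackrel{\text{i.i.d.}}{\sim} f \ \text{on}\ \Omega \subset \mathbb{R}^2,
  \qquad
  Y_i = \varphi(X_i) \in \mathcal{M} = \varphi(\Omega) \subset \mathbb{R}^D .
\]
```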

  12. Alpha-Entropy and Divergence • Alpha-entropy • Alpha-divergence • Other alpha-dissimilarity measures • Alpha-Jensen difference • Alpha geometric-arithmetic (GA) divergence
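
The slide's equations do not survive in the transcript; for reference, the standard Rényi forms of the first two quantities, for densities f and g and order alpha in (0,1), are:

```latex
% Renyi alpha-entropy of a density f
\[
  H_\alpha(f) = \frac{1}{1-\alpha} \log \int f^\alpha(x)\, dx ,
\]
% Renyi alpha-divergence of f from a reference density g
\[
  D_\alpha(f \,\|\, g) = \frac{1}{\alpha-1} \log \int g(x)
      \left( \frac{f(x)}{g(x)} \right)^{\!\alpha} dx .
\]
```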

  13. MST and Geodesic MST • For a set of points X_n = {x_1, ..., x_n} in d-dimensional Euclidean space, the Euclidean MST with edge power weighting gamma minimizes the sum of gamma-powered edge lengths over all spanning trees of X_n • The edge lengths are drawn from the pairwise distance matrix of the complete graph on X_n • When the matrix is constructed from geodesic distances between points on M, e.g. using ISOMAP, we obtain the Geodesic MST (GMST)
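
Written out (the usual form of this length functional; notation assumed):

```latex
% Euclidean MST length functional with edge power weighting gamma
\[
  L_\gamma(X_n) = \min_{T \in \mathcal{T}(X_n)} \sum_{e \in T} |e|^{\gamma},
  \qquad 0 < \gamma < d,
\]
% where T(X_n) is the set of spanning trees of the complete graph on X_n
% and |e| is the (Euclidean or geodesic) length of edge e.
```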

  14. A Planar Sample and its Euclidean MST

  15. Convergence of Euclidean MST Beardwood, Halton, Hammersley Theorem:
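
The theorem referenced here, in its standard form (the slide's own rendering is not in the transcript): for X_1, X_2, ... i.i.d. with density f on a compact subset of R^d, d >= 2, and 0 < gamma < d,

```latex
\[
  \lim_{n \to \infty} \frac{L_\gamma(X_n)}{n^{(d-\gamma)/d}}
  = \beta_{d,\gamma} \int f^{(d-\gamma)/d}(x)\, dx
  \quad \text{a.s.},
\]
% beta_{d,gamma} is a constant depending only on d and gamma, not on f.
```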

  16. Key Result for GMST Ref: Costa & Hero, TSP 2003
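
Paraphrasing the result of Costa & Hero (notation assumed here): if the samples lie on a smooth compact m-dimensional manifold M with sampling density f relative to the volume measure on M, the GMST length obeys a BHH-type limit

```latex
\[
  \lim_{n \to \infty} \frac{L_\gamma(Y_n)}{n^{(m-\gamma)/m}}
  = \beta_{m,\gamma} \int_{\mathcal{M}} f^{(m-\gamma)/m}(y)\, \mu(dy)
  \quad \text{a.s.}
\]
% The growth exponent (m - gamma)/m identifies the intrinsic dimension m;
% the limiting constant is a monotone function of the Renyi entropy of f
% of order alpha = (m - gamma)/m.
```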

  17. Special Cases • Isometric embedding (ISOMAP) • Conformal embedding (C-ISOMAP)

  18. Remarks • Result holds for many other combinatorial optimization algorithms (Costa & Hero 2003) • K-NNG • Steiner trees • Minimal matchings • Traveling Salesman Tours • a.s. convergence rates (Hero et al. 2002) • For isometric embeddings the Jacobian does not have to be estimated for dimension estimation
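
For instance, the k-NNG case replaces the spanning-tree minimization with a sum over each point's k nearest neighbors (standard form of the functional; notation assumed):

```latex
% k-NN graph length functional with edge power weighting gamma
\[
  L_{\gamma,k}(X_n) = \sum_{i=1}^{n} \;\sum_{x \in \mathcal{N}_k(X_i)} |x - X_i|^{\gamma},
\]
% where N_k(X_i) denotes the k nearest neighbors of X_i among X_n \ {X_i}.
```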

  19. Joint Estimation Algorithm • Assume the large-n log-affine model log L_n = a log n + b • Use bootstrap resampling to estimate the mean MST length at a sequence of sample sizes, and apply LS to jointly estimate slope and intercept from that sequence • Extract d and H from the slope and intercept (see the sketch below)
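
A minimal Python sketch of this procedure, under assumed details: Euclidean MST (in practice the geodesic distance matrix from ISOMAP would be substituted), gamma = 1, and the log-affine model above with slope a = (d - gamma)/d. The function names and parameter choices are illustrative, not from the original:

```python
# Sketch: bootstrap + least-squares estimation of intrinsic dimension
# from MST length growth (assumptions noted in the lead-in).
import numpy as np
from scipy.sparse.csgraph import minimum_spanning_tree
from scipy.spatial.distance import pdist, squareform

def mst_length(points, gamma=1.0):
    """Total gamma-weighted edge length of the Euclidean MST."""
    dists = squareform(pdist(points))      # pairwise distance matrix
    tree = minimum_spanning_tree(dists)    # sparse matrix of MST edges
    return (tree.data ** gamma).sum()

def estimate_dim(X, sizes, n_boot=25, gamma=1.0, rng=None):
    """LS fit of log MST length vs log n; returns (d_hat, slope, intercept)."""
    rng = np.random.default_rng(rng)
    mean_lengths = []
    for n in sizes:
        # Bootstrap: average MST length over random n-point subsamples.
        lengths = [mst_length(X[rng.choice(len(X), n, replace=False)], gamma)
                   for _ in range(n_boot)]
        mean_lengths.append(np.mean(lengths))
    a, b = np.polyfit(np.log(sizes), np.log(mean_lengths), 1)
    d_hat = int(round(gamma / (1.0 - a)))  # invert a = (d - gamma)/d
    # Recovering the entropy from the intercept b additionally needs the
    # constant beta_{d,gamma}, which is only known approximately; omitted.
    return d_hat, a, b

# Usage: uniform samples on a 2D plane embedded in R^3 (d_hat should be 2).
X = np.random.default_rng(0).uniform(size=(1000, 2)) @ np.array(
    [[1.0, 0.0, 0.5], [0.0, 1.0, 0.5]])
print(estimate_dim(X, sizes=[100, 200, 400, 800]))
```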

  20. Random Samples on a Swiss Roll • Ref: Grimes and Donoho (2003)

  21. Bootstrap Estimates of GMST Length

  22. Log-log Linear Fit to GMST Length

  23. Dimension and Entropy Estimates • From LS fit find: • Intrinsic dimension estimate • Alpha-entropy estimate (nats)
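
Explicitly, inverting the log-affine model log L_n = a log n + b with a = (d - gamma)/d (a sketch in the notation assumed above; beta is the BHH constant):

```latex
% Estimates from the LS slope a-hat and intercept b-hat:
\[
  \hat d = \operatorname{round}\!\left( \frac{\gamma}{1 - \hat a} \right),
  \qquad
  \hat H_\alpha = \frac{\hat d}{\gamma}
      \left( \hat b - \log \beta_{\hat d, \gamma} \right)
  \ \text{(nats)},
  \qquad
  \alpha = \frac{\hat d - \gamma}{\hat d} .
\]
```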

  24. Dimension Estimation Comparisons

  25. Practical Application • Yale Face Database B • Photographic folios of many people’s faces • Each face folio contains images at 585 different illumination/pose conditions • Subsampled to 64 by 64 pixels (4096 extrinsic dimensions) • Objective: determine intrinsic dimension and entropy of a face folio

  26. GMST for 3 Face Folios

  27. GMST for 3 Face Folios

  28. Yale Face Database Results • GMST LS estimation parameters • ISOMAP used to generate the pairwise distance matrix • LS based on 25 resamplings over the 26 largest folio sizes • To represent any folio we might hope to attain • factor > 600 reduction in degrees of freedom (dim) • only 1/10 bit per pixel for compression • a practical parameterization/encoder? Ref: Costa & Hero 2003

  29. Conclusions Advantages of Geodesic Entropic Graph Methods • Characterizing high-dimensional sampling distributions • Standard techniques (histogram, density estimation) fail due to the curse of dimensionality • Entropic graphs can be used to construct consistent estimators of entropy and information divergence • Robustification to outliers via pruning • Manifold learning and model reduction • Standard techniques (LLE, MDS, LE, HE) rely on local linear fits • Entropic graph methods fit the manifold globally • Computational complexity is only O(n log n)

  30. Summary of Algorithm • Run ISOMAP or C-ISOMAP algorithm to generate pairwise distance matrix on intrinsic domain of manifold • Build geodesic entropic graph from pairwise distance matrix • MST: consistent estimator of manifold dimension and process alpha-entropy • K-NNG: consistent estimator of information divergence between labeled vectors • Use bootstrap resampling and LS fitting to extract rate of convergence (intrinsic dimension) and convergence factor (entropy) of entropic graph
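
A sketch of the first two steps (geodesic distance matrix, then the GMST), assuming scikit-learn's Isomap and SciPy's minimum_spanning_tree; the resulting (n, length) pairs would then go through the bootstrap/LS fit sketched earlier. All parameter choices are illustrative:

```python
# Sketch: geodesic entropic-graph pipeline (illustrative parameters;
# gamma = 1 and k = 8 neighbors are assumptions, not from the slides).
from scipy.sparse.csgraph import minimum_spanning_tree
from sklearn.datasets import make_swiss_roll
from sklearn.manifold import Isomap

def gmst_length(X, n_neighbors=8, gamma=1.0):
    """Gamma-weighted total edge length of the geodesic MST."""
    iso = Isomap(n_neighbors=n_neighbors)
    iso.fit(X)
    geo = iso.dist_matrix_             # pairwise geodesic distance matrix
    tree = minimum_spanning_tree(geo)  # geodesic MST over the samples
    return (tree.data ** gamma).sum()

# Swiss roll: intrinsic dimension 2 embedded in R^3.
X, _ = make_swiss_roll(n_samples=800, random_state=0)
for n in (100, 200, 400, 800):
    print(n, gmst_length(X[:n]))       # feed (n, length) pairs to the LS fit
```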

  31. Swiss Roll Example Uniform Samples on 3D Embedding of Swiss Roll

  32. Geodesic Minimal Spanning Tree GMST over Uniform Samples on Swiss Roll

  33. Geodesic MST on Embedded Mixture GMST on Gaussian Samples on Swiss Roll

  34. Classifying on a Manifold Class A Class B
