Spectral Methods

Presentation Transcript

  1. Spectral Methods. Tutorial 6. © Maks Ovsjanikov. tosca.cs.technion.ac.il/book. Numerical geometry of non-rigid shapes, Stanford University, Winter 2009.

  2. Outline On a Connection between Kernel PCA and Metric Multidimensional Scaling Williams C., Advances in Neural Information Proc. Sys., 2001 Classic MDS and PCA review. Metric MDS. Kernel PCA, kernel trick, relation to Metric MDS. Summary. Articulated Shape Matching by Robust Alignment of Embedded Representations Mateus D. et al., Workshop on 3DRR, 2007 Laplace-Beltrami Eigenfunctions for Deformation Invariant Shape Representation Rustamov R., SGP, 2007

  3. On a Connection between Kernel PCA and Metric Multidimensional Scaling Williams C., Advances in Neural Information Proc. Sys., 2001. Classic MDS (classical scaling) recap. Given a dissimilarity matrix $D$ with entries $d_{ij} = \|x_i - x_j\|$ arising from a normed vector space, we want to find the coordinates of points $x_1, \dots, x_n$ that would give rise to $D$. E.g. given pairwise distances between cities on a map, find their locations. Can only hope to recover the points up to rotation, translation (and reflection).

  4. On a Connection between Kernel PCA and Metric Multidimensional Scaling Williams C., Advances in Neural Information Proc. Sys., 2001. Classic MDS (classical scaling). Centering matrix $H = I - \frac{1}{n}\mathbf{1}\mathbf{1}^T$. Define $B = -\frac{1}{2} H \Delta H$, where $\Delta_{ij} = d_{ij}^2$. Attention: this only works for normed vector spaces!

  5. On a Connection between Kernel PCA and Metric Multidimensional Scaling Williams C., Advances in Neural Information Proc. Sys., 2001 • Classic MDS (classical scaling). • Define $B = -\frac{1}{2} H \Delta H$, with $\Delta_{ij} = d_{ij}^2$. • Express $B = (HX)(HX)^T = X_c X_c^T$ to obtain the Gram matrix of the centered points $X_c$. • Note that if $B = X_c X_c^T$, then $B = (X_c R)(X_c R)^T$ for any orthonormal $R$. • Since $B$ is symmetric, we can find its eigendecomposition: $B = V \Lambda V^T$ and set $X_c = V \Lambda^{1/2}$.

  6. Multivariate Analysis Mardia K.V. et al., Academic Press, 1979. Classic MDS (classical scaling). Although $B$ is an $n \times n$ matrix, it has only $k$ non-zero eigenvalues if $X$ was sampled from $\mathbb{R}^k$. Can project onto the first $k$ eigenvectors by taking: $X_c = V_k \Lambda_k^{1/2}$.

  7. Multivariate Analysis Mardia K.V. et al., Academic Press, 1979 • Optimality condition of classic MDS • Theorem: If $X = \{x_1, \dots, x_n\}$ is a set of points in $\mathbb{R}^p$ with distances $d_{ij} = \|x_i - x_j\|$, then for any $k$-dimensional orthonormal projection $P$, the distortion $\sum_{i,j} \left( d_{ij}^2 - \hat{d}_{ij}^2 \right)$ is minimized when $X$ is projected onto its first $k$ principal directions.
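
As a concrete companion to slides 3-7, the following is a minimal NumPy sketch of classical scaling, assuming the distance matrix comes from points in a Euclidean space; the function name `classical_mds` and the random test data are illustrative choices, not from the tutorial.

```python
# Minimal classical scaling: B = -1/2 H (D^2) H, eigendecompose, keep top-k eigenpairs.
import numpy as np

def classical_mds(D, k):
    """Embed points in R^k from an n x n matrix D of pairwise distances."""
    n = D.shape[0]
    H = np.eye(n) - np.ones((n, n)) / n          # centering matrix H = I - (1/n) 11^T
    B = -0.5 * H @ (D ** 2) @ H                  # centered Gram matrix of the (unknown) points
    w, V = np.linalg.eigh(B)                     # symmetric eigendecomposition (ascending)
    idx = np.argsort(w)[::-1][:k]                # k largest eigenvalues/eigenvectors
    w, V = np.clip(w[idx], 0.0, None), V[:, idx] # guard against tiny negative eigenvalues
    return V * np.sqrt(w)                        # X_c = V_k Lambda_k^{1/2}

# Usage: recover 2-D locations from pairwise distances (up to rotation/translation/reflection).
X = np.random.rand(10, 2)
D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
Y = classical_mds(D, k=2)
D_hat = np.linalg.norm(Y[:, None, :] - Y[None, :, :], axis=-1)
assert np.allclose(D, D_hat)                     # embedding reproduces the input distances
```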

  8. On a Connection between Kernel PCA and Metric Multidimensional Scaling Williams C., Advances in Neural Information Proc. Sys., 2001. Classic MDS – Relation to PCA. During standard Principal Component Analysis, one performs eigendecomposition of the covariance matrix $S = X_c^T X_c$ (up to normalization). Try to find a more natural basis to express the points in.

  11. On a Connection between Kernel PCA and Metric Multidimensional Scaling Williams C., Advances in Neural Information Proc. Sys., 2001. Classic MDS – Relation to PCA. During standard PCA, one performs eigendecomposition of the covariance matrix $S = X_c^T X_c$. Using the centering matrix, we can express: $X_c = HX$, so that $B = X_c X_c^T$. For any eigenvector $v$ of $S$ with eigenvalue $\lambda$ we have: $X_c^T X_c v = \lambda v$, which implies: $X_c X_c^T (X_c v) = \lambda (X_c v)$, i.e. $B (X_c v) = \lambda (X_c v)$. The non-zero eigenvalues of $B$ and $S$ are the same and the eigenvectors of $B$ are given by $X_c v$ (after normalization).
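
The eigenvalue relation above is easy to verify numerically. The small check below is my own illustration with random data: $X_c X_c^T$ and $X_c^T X_c$ share their non-zero eigenvalues, and $X_c v$ is the corresponding eigenvector of $B$.

```python
# Numerical check of the relation between B = X_c X_c^T (n x n) and S = X_c^T X_c (k x k).
import numpy as np

n, k = 50, 3
X = np.random.rand(n, k)
Xc = X - X.mean(axis=0)                      # same effect as H @ X with the centering matrix

B = Xc @ Xc.T
S = Xc.T @ Xc

eig_B = np.sort(np.linalg.eigvalsh(B))[-k:]  # the k largest (non-zero) eigenvalues of B
eig_S = np.sort(np.linalg.eigvalsh(S))
assert np.allclose(eig_B, eig_S)

# An eigenvector v of S maps to an eigenvector X_c v of B with the same eigenvalue.
w, V = np.linalg.eigh(S)
u = Xc @ V[:, -1]
assert np.allclose(B @ u, w[-1] * u)
```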

  12. On a Connection between Kernel PCA and Metric Multidimensional Scaling Williams C., Advances in Neural Information Proc. Sys., 2001. Classic MDS – Relation to PCA. The eigenvalues of $B = X_c X_c^T$ and $S = X_c^T X_c$ are the same and the eigenvectors are related by $u = X_c v$. $S$ has the advantage that its size is $k \times k$ (rather than $n \times n$) and that it is positive definite rather than positive semi-definite, so its eigendecomposition is more stable. However, if we are only given pairwise distances, we cannot construct $S$ directly. PCA and MDS are solving different problems!

  13. On a Connection between Kernel PCA and Metric Multidimensional Scaling Williams C., Advances in Neural Information Proc. Sys., 2001 • Metric MDS. • Suppose instead of minimizing the distortion (stress), we want to minimize a derived stress. Given pairwise distances $d_{ij}$, find a set of points $\hat{x}_1, \dots, \hat{x}_n$ to minimize: $\sum_{i<j} \left( f(d_{ij}) - \|\hat{x}_i - \hat{x}_j\| \right)^2$ for some given function $f$. • Even if the $d_{ij}$ come from a Euclidean space, the problem is much more difficult. • Resort to numerical optimization: differentiate w.r.t. the $\hat{x}_i$ to get the gradient (a sketch is given below). • Alternative: perform classical MDS on the derived distances $f(d_{ij})$, reducing the problem to an eigensystem. • Problem: the resulting matrix $B$ is no longer guaranteed to be positive semi-definite. Critchley F., Multidimensional Scaling: a short critique and a new algorithm, COMPSTAT, 1978
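
As a sketch of the numerical-optimization route (not the specific algorithm of any of the cited papers), the following minimizes the derived stress with a generic optimizer; the choice of $f$, the optimizer, and all names here are illustrative assumptions.

```python
# Metric MDS by generic optimization of the derived stress sum_{i<j} (f(d_ij) - ||y_i - y_j||)^2.
import numpy as np
from scipy.optimize import minimize
from scipy.spatial.distance import pdist

def metric_mds(D, k, f=np.sqrt, random_state=0):
    n = D.shape[0]
    target = f(D[np.triu_indices(n, 1)])          # derived distances f(d_ij), i < j

    def stress(y_flat):
        Y = y_flat.reshape(n, k)
        return np.sum((pdist(Y) - target) ** 2)   # residual of embedded vs derived distances

    rng = np.random.default_rng(random_state)
    y0 = rng.standard_normal(n * k)
    res = minimize(stress, y0, method="L-BFGS-B") # gradient is approximated numerically here
    return res.x.reshape(n, k)

# Usage on a small random distance matrix.
X = np.random.rand(8, 2)
D = np.linalg.norm(X[:, None] - X[None, :], axis=-1)
Y = metric_mds(D, k=2)
```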

  14. On a Connection between Kernel PCA and Metric Multidimensional Scaling Williams C., Advances in Neural Information Proc. Sys., 2001. Kernel PCA. Basic Idea: represent a point $x$ by its image $\Phi(x)$ in a feature space $F$. The domains can be completely different! Kernel Trick: in many applications we do not need to know $\Phi$ explicitly, we only need to operate with inner products $k(x_i, x_j) = \langle \Phi(x_i), \Phi(x_j) \rangle$, provided the kernel $k$ can be computed efficiently (e.g. $F$ can be infinite dimensional).

  16. On a Connection between Kernel PCA and Metric Multidimensional Scaling Williams C., Advances in Neural Information Proc. Sys., 2001. Kernel PCA. Could do PCA in the feature space: compute the covariance matrix of the feature vectors, and perform its eigendecomposition. However, instead of the feature-space covariance matrix, could use the kernel matrix $K$, $K_{ij} = k(x_i, x_j)$. If the dimension of the feature vectors is greater than $n$, this is more efficient! To center the data, so that $\sum_i \Phi(x_i) = 0$, one can use the centering matrix and find the eigenvalues of $HKH$. Schölkopf, B., et al., Nonlinear component analysis as a kernel eigenvalue problem, Neural Computation, 1998
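
A minimal kernel PCA sketch along the lines of slide 16, assuming a Gaussian kernel as the example kernel; the function name and parameters are illustrative choices, not taken from Schölkopf et al.

```python
# Kernel PCA: form K, double-center it with H, eigendecompose H K H.
import numpy as np

def kernel_pca(X, k, sigma=1.0):
    n = X.shape[0]
    sq = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    K = np.exp(-sq / (2 * sigma ** 2))            # K_ij = k(x_i, x_j); Phi is never formed explicitly
    H = np.eye(n) - np.ones((n, n)) / n
    Kc = H @ K @ H                                # center the (implicit) feature vectors
    w, V = np.linalg.eigh(Kc)
    idx = np.argsort(w)[::-1][:k]
    w, V = np.clip(w[idx], 0.0, None), V[:, idx]
    return V * np.sqrt(w)                         # coordinates in the feature-space PCA basis

Y = kernel_pca(np.random.rand(30, 3), k=2)
```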

  17. On a Connection between Kernel PCA and Metric Multidimensional Scaling Williams C., Advances in Neural Information Proc. Sys., 2001. Kernel PCA and Metric MDS. Spherical (isotropic) kernel: depends only on the distance between the points, $k(x_i, x_j) = r(\|x_i - x_j\|) = r(d_{ij})$. If we assume that $k(x_i, x_i) = r(0)$ for every point (which holds for any isotropic kernel), then: $\|\Phi(x_i) - \Phi(x_j)\|^2 = k(x_i, x_i) + k(x_j, x_j) - 2k(x_i, x_j) = 2\left(r(0) - r(d_{ij})\right)$.

  18. On a Connection between Kernel PCA and Metric Multidimensional Scaling Williams C., Advances in Neural Information Proc. Sys., 2001 • Kernel PCA and Metric MDS. • Suppose we're given a matrix of pairwise distances $d_{ij}$. • If we set $\delta_{ij}^2 = 2(r(0) - r(d_{ij}))$ (the squared feature-space distances), then $A_{ij} = -\frac{1}{2}\delta_{ij}^2 = r(d_{ij}) - r(0) = K_{ij} - r(0)$. • In matrix form: $A = K - r(0)\,\mathbf{1}\mathbf{1}^T$, and moreover: $HAH = HKH$, since $H\mathbf{1} = 0$. • Thus, performing Classical MDS on $K$ is equivalent to performing it on $A$. • Classical MDS on $A$ attempts to approximate $\delta_{ij} = \sqrt{2(r(0) - r(d_{ij}))}$, which is a nonlinear function of distance. So classical MDS on $K$ is metric MDS on the derived distances $f(d_{ij}) = \sqrt{2(r(0) - r(d_{ij}))}$.
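
The identity $HAH = HKH$ is easy to check numerically. The sketch below uses a Gaussian profile $r$ as an illustrative isotropic kernel; that particular choice is mine, not the paper's.

```python
# Check: with K_ij = r(d_ij) and A_ij = -1/2 * delta_ij^2, delta_ij^2 = 2(r(0) - r(d_ij)),
# double-centering gives H A H = H K H, so classical MDS on K and on A coincide.
import numpy as np

n = 20
X = np.random.rand(n, 3)
D = np.linalg.norm(X[:, None] - X[None, :], axis=-1)

r = lambda d: np.exp(-d ** 2 / 2.0)     # an example isotropic kernel profile
K = r(D)
A = -0.5 * (2.0 * (r(0.0) - r(D)))      # equals K - r(0) * 11^T

H = np.eye(n) - np.ones((n, n)) / n
assert np.allclose(H @ A @ H, H @ K @ H)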

  19. On a Connection between Kernel PCA and Metric Multidimensional Scaling Williams C., Advances in Neural Information Proc. Sys., 2001. Kernel PCA and Metric MDS. Thus, performing Classical MDS on $K$ is equivalent to performing it on $A$. Classical MDS on $K$ attempts to approximate $\delta_{ij} = \sqrt{2(r(0) - r(d_{ij}))}$, which is a nonlinear function of distance, so classical MDS on $K$ is metric MDS on the derived distances $f(d_{ij})$. Since $HKH$ is a centered Gram matrix of the feature vectors, it is positive semi-definite if the kernel is chosen appropriately; this is not the case for arbitrary metric MDS functions. An advantage of doing Kernel PCA is that a new point can be quickly projected onto a pre-computed basis, which is difficult with numerical optimization.

  20. On a Connection between Kernel PCA and Metric Multidimensional Scaling Williams C., Advances in Neural Information Proc. Sys., 2001. Summary: If the distance matrix comes from points in a normed vector space, MDS reduces to an eigenvalue problem – classical scaling. This classical MDS is also closely related to PCA, which computes the optimal basis when the point positions are known. Kernel PCA transforms the points to a feature space and uses the kernel trick to compute PCA in this space. Metric MDS approximates derived distances $f(d_{ij})$, for some given function $f$. If the kernel is spherical (isotropic), then Kernel PCA is a special case of metric MDS, for the function $f(d) = \sqrt{2(r(0) - r(d))}$.

  21. Outline On a Connection between Kernel PCA and Metric Multidimensional Scaling Williams C., Advances in Neural Information Proc. Sys., 2001 Classic MDS and PCA review. Metric MDS. Kernel PCA, kernel trick, relation to Metric MDS. Summary. Articulated Shape Matching by Robust Alignment of Embedded Representations Mateus D. et al., Workshop on 3DRR, 2007 Laplace-Beltrami Eigenfunctions for Deformation Invariant Shape Representation Rustamov R., SGP, 2007

  22. Articulated Shape Matching by Robust Alignment of Embedded Representations Mateus D. et al., Workshop on 3DRR, 2007. Problem: Given two articulated shapes in different poses, find point correspondences between them. There are many degrees of freedom, so rigid alignment cannot be applied directly. Images by Q.-X. Huang et al. 08

  23. Articulated Shape Matching by Robust Alignment of Embedded Representations Mateus D. et al., Workshop on 3DRR, 2007. Approach: Embed each shape into a feature space defined by the Laplacian. The embedding is isometry invariant: it is unchanged under any isometric deformation of the shape. The embedding is only defined up to a rigid transform in the feature space. Find the optimal rigid transform in the feature space to obtain the correspondences.

  24. Articulated Shape Matching by Robust Alignment of Embedded Representations Mateus D. et al., Workshop on 3DRR, 2007 • Approach: • The shape is given as a point cloud. Approximate the Laplacian on a neighbourhood graph, e.g. with Gaussian weights $W_{ij} = e^{-\|x_i - x_j\|^2 / 2\sigma^2}$, $D = \mathrm{diag}\big(\sum_j W_{ij}\big)$, $L = D - W$. • Solve the generalized eigenvalue problem: $L u = \lambda D u$. • Find the $k$ most significant eigenvalues/eigenvectors. • For each data point $x_j$, let $\Phi(x_j) = (u_1(j), \dots, u_k(j))$, • where $u_i$ is the i-th eigenvector of $L$.
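
A sketch of the embedding step under stated assumptions (Gaussian weights on a nearest-neighbour graph, $L = D - W$, generalized eigenproblem $Lu = \lambda D u$); the exact graph construction and parameters used by Mateus et al. may differ.

```python
# Laplacian embedding of a point cloud: nearest-neighbour graph, L u = lambda D u.
import numpy as np
from scipy.linalg import eigh
from scipy.spatial.distance import cdist

def laplacian_embedding(points, k_dims=5, n_neighbors=8, sigma=0.1):
    dist = cdist(points, points)
    W = np.exp(-dist ** 2 / (2 * sigma ** 2))
    # Keep only each point's nearest neighbours (symmetrized); zero elsewhere.
    mask = np.zeros_like(W, dtype=bool)
    nn = np.argsort(dist, axis=1)[:, 1:n_neighbors + 1]
    rows = np.repeat(np.arange(len(points)), n_neighbors)
    mask[rows, nn.ravel()] = True
    W = np.where(mask | mask.T, W, 0.0)
    D = np.diag(W.sum(axis=1))
    L = D - W
    # Generalized eigenproblem L u = lambda D u; skip the constant eigenvector (lambda = 0).
    vals, vecs = eigh(L, D)
    return vecs[:, 1:k_dims + 1]                  # Phi(x_j) = (u_1(j), ..., u_k(j))

emb = laplacian_embedding(np.random.rand(200, 3))
```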

  25. Articulated Shape Matching by Robust Alignment of Embedded Representations Mateus D. et al., Workshop on 3DRR, 2007 • Approach: • For each data point $x_j$, let $\Phi(x_j) = (u_1(j), \dots, u_k(j))$, • where $u_i$ is the i-th eigenvector of $L$. • Would like to have $\Phi(x) = \Phi'(y)$ for corresponding points $x \leftrightarrow y$ of the two shapes. • Reflection: each eigenvector is only defined up to a sign. • Rotation: if $u_i$ and $u_j$ correspond to the same eigenvalue, then $\cos\theta\, u_i + \sin\theta\, u_j$ is also an eigenvector, for any angle $\theta$. • Points from the two point sets can therefore be aligned using: $\Phi'(y) \approx R\,\Phi(x)$, • where $R$ is orthogonal.

  26. Articulated Shape Matching by Robust Alignment of Embedded Representations Mateus D. et al., Workshop on 3DRR, 2007. Approach: Given point correspondences, it is easy to obtain the optimal orthogonal matrix using the SVD approach from optimal rigid alignment. Let $M = \sum_i \Phi(x_i)\,\Phi'(y_i)^T$, and compute its singular value decomposition: $M = U \Sigma V^T$. The optimal solution is given by: $R = V U^T$. With this step, one can perform ICP in the feature space to find the optimal correspondences.
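
A short sketch of the SVD alignment step; the variable names and the synthetic test are illustrative, but the $R = V U^T$ formula follows the standard orthogonal Procrustes solution referenced on the slide.

```python
# Optimal orthogonal alignment between corresponding embedded points.
import numpy as np

def optimal_orthogonal(P, Q):
    """Orthogonal R minimizing sum_i || R P[i] - Q[i] ||^2 (rows are embedded points)."""
    M = P.T @ Q                        # cross-covariance of corresponding embedded points
    U, _, Vt = np.linalg.svd(M)
    return Vt.T @ U.T                  # R = V U^T

# Usage: recover a random orthogonal transform applied to an embedding.
P = np.random.rand(100, 5)
R_true, _ = np.linalg.qr(np.random.rand(5, 5))    # a random orthogonal matrix
Q = P @ R_true.T                                   # Q[i] = R_true P[i]
R = optimal_orthogonal(P, Q)
assert np.allclose(R @ P.T, Q.T)
```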

  27. Articulated Shape Matching by Robust Alignment of Embedded Representations Mateus D. et al., Workshop on 3DRR, 2007 Results:

  28. Laplace-Beltrami Eigenfunctions for Deformation Invariant Shape Representation Rustamov R., SGP, 2007. Main Goal: Find a good, isometry-invariant shape descriptor. Good: efficient, easily computable, insensitive to local topology changes (unlike MDS).

  29. Laplace-Beltrami Eigenfunctions for Deformation Invariant Shape Representation Rustamov R., SGP, 2007. Main Idea: For every point $p$ define a Global Point Signature $GPS(p) = \left( \frac{\phi_1(p)}{\sqrt{\lambda_1}}, \frac{\phi_2(p)}{\sqrt{\lambda_2}}, \dots \right)$, where $\phi_i$ is an eigenfunction of the Laplace-Beltrami operator with eigenvalue $\lambda_i$. GPS is a mapping of the surface into an infinite-dimensional space. Each point gets a signature.
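
A minimal sketch of assembling truncated GPS vectors, assuming the Laplace-Beltrami eigenvalues and eigenfunctions sampled at the mesh vertices have already been computed by some discrete solver (e.g. a cotangent Laplacian); how they are obtained is outside this snippet, and the placeholder data is illustrative.

```python
# Truncated Global Point Signatures from precomputed Laplace-Beltrami eigenpairs.
import numpy as np

def gps_signature(evals, efuncs):
    """GPS(p) = (phi_1(p)/sqrt(lambda_1), phi_2(p)/sqrt(lambda_2), ...), one row per vertex.

    evals:  (m,) non-zero eigenvalues; efuncs: (n_vertices, m) eigenfunctions at the vertices.
    """
    return efuncs / np.sqrt(evals)[None, :]

# With placeholder eigenpairs:
evals = np.array([1.0, 2.5, 4.0])
efuncs = np.random.rand(1000, 3)
G = gps_signature(evals, efuncs)      # (1000, 3) array of truncated GPS vectors
```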

  30. Laplace-Beltrami Eigenfunctions for Deformation Invariant Shape Representation Rustamov R., SGP, 2007 • Properties of GPS: • If $GPS(p) = GPS(q)$ then $p = q$ (the embedding is injective). • GPS is isometry invariant (since the Laplace-Beltrami operator is). • Given all eigenfunctions and eigenvalues, one can recover the shape up to isometry (not true if only the eigenvalues are known). • Euclidean distances in the GPS embedding are meaningful: • K-means done on the embedding provides a segmentation (see the sketch below).
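
The segmentation remark in the last bullet can be sketched in a couple of lines, assuming scikit-learn is available; the cluster count and placeholder data are illustrative choices.

```python
# K-means in the GPS embedding: one segment label per vertex.
import numpy as np
from sklearn.cluster import KMeans

G = np.random.rand(1000, 3)                               # placeholder GPS vectors (see above)
labels = KMeans(n_clusters=6, n_init=10).fit_predict(G)   # vertex -> segment id
```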

  31. Laplace-Beltrami Eigenfunctions for Deformation Invariant Shape Representation Rustamov R., SGP, 2007 • Comparing GPS: • Given a shape, determine its GPS embedding. • Construct a histogram of pairwise GPS distances (note that although GPS is only defined up to sign flips, the distances are preserved). • For any two shapes, compute the norm of the difference between their histograms. • For refined comparisons, use more than one histogram.
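
A sketch of the histogram comparison; the bin count, the shared range, and the particular norm are illustrative assumptions rather than the paper's exact choices.

```python
# Compare two shapes by histograms of pairwise GPS distances.
import numpy as np
from scipy.spatial.distance import pdist

def gps_histogram(G, bins=64, d_max=None):
    d = pdist(G)                                    # pairwise distances between GPS vectors
    hist, _ = np.histogram(d, bins=bins, range=(0.0, d_max or d.max()), density=True)
    return hist

def shape_distance(G1, G2, bins=64):
    d_max = max(pdist(G1).max(), pdist(G2).max())   # shared range so the bins are comparable
    h1 = gps_histogram(G1, bins, d_max)
    h2 = gps_histogram(G2, bins, d_max)
    return np.abs(h1 - h2).sum()                    # norm of the histogram difference
```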

  32. Laplace-Beltrami Eigenfunctions for Deformation Invariant Shape Representation Rustamov R., SGP, 2007 • Results:

  33. Conclusions • Kernel methods attempt to embed the shape into a feature space that can be manipulated more easily. • The Laplacian embedding is useful because of its isometry invariance, and can be used for comparing non-rigid shapes under isometric deformations. • Sign flipping and repeated eigenvalues can cause difficulties (there is no canonical way to choose them). • Limitations: • The embeddings are not necessarily stable or mesh independent. • They are difficult to compute for large meshes (millions of points). • Neither topological nor geometric stability is well understood.
