Lecture 20: SVD and Its Applications. Shang-Hua Teng
Spectral Theorem and Spectral Decomposition. Every symmetric matrix A can be written as $A = \lambda_1 x_1 x_1^T + \cdots + \lambda_n x_n x_n^T = \sum_{i=1}^{n} \lambda_i x_i x_i^T$, where $x_1, \ldots, x_n$ are the n orthonormal eigenvectors of A; they are the principal axes of A. Each $x_i x_i^T$ is the projection matrix onto $x_i$.
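As a minimal numpy sketch of this decomposition (the 2 x 2 matrix below is an illustrative choice of mine, not from the lecture), we can rebuild A from its eigenvalues and the rank-1 projections $x_i x_i^T$:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])          # a small symmetric example matrix

lam, X = np.linalg.eigh(A)          # eigh gives orthonormal eigenvectors for symmetric input

# Rebuild A as the sum of rank-1 projections lambda_i * x_i x_i^T
A_rebuilt = sum(lam[i] * np.outer(X[:, i], X[:, i]) for i in range(len(lam)))
print(np.allclose(A, A_rebuilt))    # True
```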
Singular Value Decomposition
• Any m by n matrix A may be factored as $A = U \Sigma V^T$
• U: m by m, orthogonal; its columns are the left singular vectors of A
• V: n by n, orthogonal; its columns are the right singular vectors of A
• $\Sigma$: m by n, diagonal, with r nonzero singular values
The Singular Value Decomposition
[Diagram: A (m x n) = U (m x m) · $\Sigma$ (m x n) · $V^T$ (n x n)]
r = the rank of A = number of linearly independent columns/rows
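These dimensions are easy to check numerically; here is a hedged numpy sketch on an arbitrary 4 x 3 rank-2 matrix of my own:

```python
import numpy as np

A = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0],
              [1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0]])         # rank 2: row2 = 2*row1, row1 = row3 + 2*row4

U, s, Vt = np.linalg.svd(A)             # full SVD: U is m x m, Vt is n x n
Sigma = np.zeros(A.shape)
np.fill_diagonal(Sigma, s)              # Sigma is the m x n diagonal factor

print(U.shape, Sigma.shape, Vt.shape)   # (4, 4) (4, 3) (3, 3)
print(np.allclose(A, U @ Sigma @ Vt))   # True
print(int(np.sum(s > 1e-10)))           # 2 = r, the rank of A
```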
SVD Properties
• U, V give us orthonormal bases for the subspaces of A:
• First r columns of U: column space of A
• Last m - r columns of U: left nullspace of A
• First r columns of V: row space of A
• Last n - r columns of V: nullspace of A
• IMPLICATION: Rank(A) = r (checked in the sketch below)
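A sketch of these subspace claims in numpy, on the same kind of arbitrary rank-2 example (the matrix is my choice): the trailing columns of U are annihilated by $A^T$ and the trailing columns of V by A.

```python
import numpy as np

A = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0],
              [1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0]])          # rank r = 2

U, s, Vt = np.linalg.svd(A)
r = int(np.sum(s > 1e-10))

# First r columns of U span C(A); the last m - r span the left nullspace.
# First r columns of V (rows of Vt) span the row space; the last n - r span N(A).
print(np.allclose(A.T @ U[:, r:], 0))    # left nullspace: A^T y = 0
print(np.allclose(A @ Vt[r:, :].T, 0))   # nullspace: A x = 0
```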
The Singular Value Decomposition
[Diagram: full SVD, A (m x n) = U (m x m) · $\Sigma$ (m x n) · $V^T$ (n x n); reduced SVD, A (m x n) = U (m x r) · $\Sigma$ (r x r) · $V^T$ (r x n)]
Singular Value Decomposition
$A = \sigma_1 u_1 v_1^T + \cdots + \sigma_r u_r v_r^T = \sum_{i=1}^{r} \sigma_i u_i v_i^T$
• where
• $u_1, \ldots, u_r$ are the r orthonormal vectors that form a basis of C(A), and
• $v_1, \ldots, v_r$ are the r orthonormal vectors that form a basis of $C(A^T)$
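The same identity as a short numpy sketch (the example matrix is mine): the thin SVD returns exactly the $u_i$, $\sigma_i$, $v_i$ needed to write A as a sum of rank-1 terms.

```python
import numpy as np

A = np.array([[3.0, 1.0],
              [1.0, 3.0],
              [0.0, 2.0]])

U, s, Vt = np.linalg.svd(A, full_matrices=False)   # thin SVD

# A as a sum of rank-1 pieces sigma_i * u_i v_i^T
A_sum = sum(s[i] * np.outer(U[:, i], Vt[i, :]) for i in range(len(s)))
print(np.allclose(A, A_sum))    # True
```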
SVD Proof
• Any m x n matrix A gives two symmetric covariance matrices:
• (m x m): $A A^T$
• (n x n): $A^T A$
Spectral Decomposition of the Covariance Matrices
• (m x m): $A A^T = U \Lambda_1 U^T$; the columns of U are called the left singular vectors of A
• (n x n): $A^T A = V \Lambda_2 V^T$; the columns of V are called the right singular vectors of A
• Claim: the nonzero eigenvalues of $\Lambda_1$ and $\Lambda_2$ are the same
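A quick numerical check of the claim on a random example matrix: the nonzero eigenvalues of $A A^T$ and $A^T A$ coincide, and they equal the squared singular values of A.

```python
import numpy as np

A = np.random.default_rng(0).standard_normal((4, 3))

lam1 = np.linalg.eigvalsh(A @ A.T)    # 4 eigenvalues (one extra zero, since m > n)
lam2 = np.linalg.eigvalsh(A.T @ A)    # 3 eigenvalues
s = np.linalg.svd(A, compute_uv=False)

print(np.allclose(lam1[-3:], lam2))       # nonzero eigenvalues agree
print(np.allclose(lam2, np.sort(s**2)))   # and equal the squared singular values
```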
Row and Column Space Projection
• Suppose A is an m by n matrix that has rank r, with r << n and r << m
• Then A has r non-zero singular values
• Let $A = U \Sigma V^T$ be the reduced SVD of A, where $\Sigma$ is an r by r diagonal matrix
• Examine:
The Singular Value Projection
[Diagram: A (m x n) = U (m x r) · $\Sigma$ (r x r) · $V^T$ (r x n)]
Therefore
• Rows of $U \Sigma$ are r-dimensional projections of the rows of A
• Columns of $\Sigma V^T$ are r-dimensional projections of the columns of A
• So we can compute their distances or dot products in a lower-dimensional space
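A sketch of why this works, on a synthetic low-rank matrix of my own construction: since $A A^T = (U \Sigma)(U \Sigma)^T$, all dot products (and hence distances) between rows of A are reproduced by the r-dimensional rows of $U \Sigma$.

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((100, 3)) @ rng.standard_normal((3, 50))   # 100 x 50, rank 3

U, s, Vt = np.linalg.svd(A, full_matrices=False)
r = int(np.sum(s > 1e-8))                 # r = 3
P = U[:, :r] * s[:r]                      # rows of U Sigma: r-dimensional row projections

# Same Gram matrix in 3 dimensions instead of 50
print(np.allclose(A @ A.T, P @ P.T))      # True
```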
Eigenvalues and Determinants
• Product law: $\det(A) = \lambda_1 \lambda_2 \cdots \lambda_n$
• Summation law: $\operatorname{trace}(A) = \lambda_1 + \lambda_2 + \cdots + \lambda_n$
Both can be proved by examining the characteristic polynomial $\det(A - \lambda I)$
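Both laws are easy to sanity-check numerically; the 3 x 3 matrix below is an arbitrary example of mine (eigenvalues of a nonsymmetric real matrix may be complex, so we take real parts).

```python
import numpy as np

A = np.array([[4.0, 1.0, 0.0],
              [2.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])

lam = np.linalg.eigvals(A)

# Product law: det(A) = product of eigenvalues
print(np.isclose(np.linalg.det(A), np.prod(lam).real))   # True
# Summation law: trace(A) = sum of eigenvalues
print(np.isclose(np.trace(A), np.sum(lam).real))         # True
```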
Eigenvalues and Pivots
If A is symmetric, the number of positive (negative) eigenvalues equals the number of positive (negative) pivots in $A = L D L^T$.
Topological proof: scale the off-diagonal entries of L continuously down to 0, i.e., move L continuously to I, so that $L D L^T$ moves continuously to D. The eigenvalues vary continuously and the matrix stays nonsingular along the way, so no eigenvalue can change sign: a sign change would force it to cross 0. At L = I the eigenvalues are exactly the pivots on the diagonal of D.
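A hedged check of this correspondence using scipy.linalg.ldl (a real SciPy routine; the example matrix is my own, chosen so that the factor D comes out diagonal rather than with the 2 x 2 blocks that Bunch-Kaufman pivoting can produce in general).

```python
import numpy as np
from scipy.linalg import ldl

# A symmetric matrix with both positive and negative eigenvalues
A = np.array([[ 2.0,  1.0,  0.0],
              [ 1.0, -3.0,  1.0],
              [ 0.0,  1.0,  1.0]])

L, D, perm = ldl(A)            # A = L D L^T
pivots = np.diag(D)            # valid here because D is diagonal for this matrix
lam = np.linalg.eigvalsh(A)

# Same counts of positive and negative pivots and eigenvalues
print(int(np.sum(pivots > 0)), int(np.sum(lam > 0)))   # 2 2
print(int(np.sum(pivots < 0)), int(np.sum(lam < 0)))   # 1 1
```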
Next Lecture • Dimensional reduction for Latent Semantic Analysis • Eigenvalue Problems in Web Analysis