
Linear Algebra Review



  1. Linear Algebra Review CS479/679 Pattern Recognition Dr. George Bebis

  2. n-dimensional Vector • An n-dimensional vector v is denoted as follows: the column vector with components x1, x2, . . . , xn • The transpose vT is denoted as follows: the row vector vT = (x1, x2, . . . , xn)

  3. Inner (or dot) product • Given vT = (x1, x2, . . . , xn) and wT = (y1, y2, . . . , yn), their dot product is defined as follows: v · w = x1y1 + x2y2 + . . . + xnyn (a scalar), or equivalently v · w = vTw
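
A minimal NumPy sketch (the vector values are assumed examples, not from the slide) of the dot product:

    import numpy as np

    v = np.array([1.0, 2.0, 3.0])      # vT = (x1, x2, x3)
    w = np.array([4.0, 5.0, 6.0])      # wT = (y1, y2, y3)

    # Three equivalent ways to compute the (scalar) dot product
    print(np.dot(v, w))                # 32.0
    print(v @ w)                       # 32.0
    print(np.sum(v * w))               # 32.0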

  4. Orthogonal / Orthonormal vectors • A set of vectors x1, x2, . . . , xn is orthogonal if xiTxj = 0 for all i ≠ j • A set of vectors x1, x2, . . . , xn is orthonormal if xiTxj = 0 for i ≠ j and xiTxi = 1 for all i (i.e., the vectors are mutually orthogonal and have unit length)
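
A short NumPy check, using assumed example vectors, that a set is orthonormal (VTV should equal the identity):

    import numpy as np

    x1 = np.array([1.0, 1.0, 0.0]) / np.sqrt(2)
    x2 = np.array([1.0, -1.0, 0.0]) / np.sqrt(2)
    x3 = np.array([0.0, 0.0, 1.0])

    V = np.column_stack([x1, x2, x3])
    # Orthonormal set: V^T V should equal the 3 x 3 identity matrix
    print(np.allclose(V.T @ V, np.eye(3)))   # True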

  5. Linear combinations • A vector v is a linear combination of the vectors v1, ..., vk if v = c1v1 + c2v2 + . . . + ckvk, where c1, ..., ck are scalars • Example: any vector in R3 can be expressed as a linear combination of the unit vectors i = (1, 0, 0), j = (0, 1, 0), and k = (0, 0, 1)
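
A minimal sketch, assuming arbitrary scalars c1, c2, c3, of building a vector in R3 from i, j, k:

    import numpy as np

    i = np.array([1.0, 0.0, 0.0])
    j = np.array([0.0, 1.0, 0.0])
    k = np.array([0.0, 0.0, 1.0])

    # Any v = (c1, c2, c3) in R^3 equals c1*i + c2*j + c3*k
    c1, c2, c3 = 2.0, -1.0, 5.0
    v = c1 * i + c2 * j + c3 * k
    print(v)                           # [ 2. -1.  5.]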

  6. Space spanning • A set of vectors S = (v1, v2, . . . , vk) spans some space W if every vector in W can be written as a linear combination of the vectors in S • Example: the vectors i, j, and k span R3

  7. Linear dependence • A set of vectors v1, ..., vk is linearly dependent if at least one of them is a linear combination of the others, i.e., vj = c1v1 + . . . + cj-1vj-1 + cj+1vj+1 + . . . + ckvk for some j (vj does not appear on the right side)

  8. Linear independence • A set of vectors v1, ..., vk is linearly independent if c1v1 + c2v2 + . . . + ckvk = 0 implies c1 = c2 = . . . = ck = 0 • Example: the unit vectors i, j, and k are linearly independent (a numeric check is sketched below)
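
A hedged numeric check (the vectors are assumed examples): stack the vectors as matrix columns and compare the rank with the number of vectors.

    import numpy as np

    v1 = np.array([1.0, 0.0, 0.0])
    v2 = np.array([1.0, 1.0, 0.0])
    v3 = v1 + 2 * v2                          # deliberately dependent on v1, v2

    A = np.column_stack([v1, v2, v3])
    # Independent iff the rank equals the number of vectors
    print(np.linalg.matrix_rank(A) == 3)      # False: v3 depends on v1 and v2

    B = np.column_stack([v1, v2, np.array([0.0, 0.0, 1.0])])
    print(np.linalg.matrix_rank(B) == 3)      # True: an independent set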

  9. Vector basis • A set of vectors (v1, ..., vk) is said to be a basis for a vector space W if (1) (v1, ..., vk) are linearly independent and (2) (v1, ..., vk) span W • Standard bases: R2: (1, 0), (0, 1); R3: i, j, k; Rn: e1, e2, . . . , en, where ei has a 1 in position i and 0s elsewhere

  10. Matrix Operations • Matrix addition/subtraction: matrices must be of the same size • Matrix multiplication: an m x n matrix times a q x p matrix gives an m x p matrix, under the condition n = q (a shape check is sketched below)
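
A minimal shape-compatibility sketch in NumPy (the sizes are assumed examples):

    import numpy as np

    A = np.ones((2, 3))      # m x n  (2 x 3)
    B = np.ones((3, 4))      # q x p  (3 x 4); n = q, so the product is defined
    C = A @ B                # result is m x p  (2 x 4)
    print(C.shape)           # (2, 4)

    # A @ np.ones((4, 4)) would raise an error: inner dimensions 3 and 4 differ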

  11. Identity Matrix • The n x n matrix I with 1s on the diagonal and 0s elsewhere; it satisfies AI = IA = A for any compatible matrix A

  12. Matrix Transpose • AT is obtained by interchanging the rows and columns of A: (AT)ij = Aji • Properties: (A + B)T = AT + BT and (AB)T = BTAT

  13. Symmetric Matrices • A square matrix A is symmetric if A = AT, i.e., Aij = Aji • Example: the 2 x 2 matrix with rows (1, 2) and (2, 5) is symmetric

  14. Determinants • 2 x 2: det(A) = a11a22 - a12a21 • 3 x 3: det(A) = a11(a22a33 - a23a32) - a12(a21a33 - a23a31) + a13(a21a32 - a22a31) • n x n: expand along a row or column using cofactors, e.g., det(A) = sum over j of (-1)^(1+j) a1j M1j, where M1j is the determinant of the sub-matrix obtained by deleting row 1 and column j
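
A minimal NumPy sketch (the example matrices are my own, not the slide's) computing determinants numerically:

    import numpy as np

    A2 = np.array([[1.0, 2.0],
                   [3.0, 4.0]])
    # 2 x 2: a11*a22 - a12*a21 = 1*4 - 2*3 = -2
    print(np.linalg.det(A2))                  # -2.0 (up to floating-point error)

    A3 = np.array([[2.0, 0.0, 1.0],
                   [1.0, 3.0, 0.0],
                   [0.0, 1.0, 1.0]])
    print(np.linalg.det(A3))                  # 7.0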

  15. Matrix Inverse • The inverse A-1 of a matrix A has the property: AA-1 = A-1A = I • A-1 exists only if det(A) ≠ 0 • Terminology: Singular matrix: A-1 does not exist; Ill-conditioned matrix: A is close to being singular
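
A small sketch, assuming example matrices, of inverting a matrix only when det(A) is non-zero:

    import numpy as np

    A = np.array([[4.0, 7.0],
                  [2.0, 6.0]])
    if abs(np.linalg.det(A)) > 1e-12:              # invertible only if det(A) != 0
        A_inv = np.linalg.inv(A)
        print(np.allclose(A @ A_inv, np.eye(2)))   # True: A A^-1 = I

    S = np.array([[1.0, 2.0],
                  [2.0, 4.0]])                     # singular: second row = 2 * first row
    print(np.linalg.det(S))                        # 0.0, so S has no inverse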

  16. Matrix Inverse (cont’d) • Properties of the inverse: (A-1)-1 = A, (AB)-1 = B-1A-1, (AT)-1 = (A-1)T

  17. Matrix trace • The trace tr(A) is the sum of the diagonal elements of A • Properties: tr(A + B) = tr(A) + tr(B), tr(AB) = tr(BA), tr(A) = tr(AT)
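
A quick NumPy check of the trace properties, using assumed example matrices:

    import numpy as np

    A = np.array([[1.0, 2.0],
                  [3.0, 4.0]])
    B = np.array([[0.0, 1.0],
                  [5.0, 2.0]])

    print(np.trace(A))                                             # 5.0 = sum of diagonal elements
    print(np.isclose(np.trace(A + B), np.trace(A) + np.trace(B)))  # True
    print(np.isclose(np.trace(A @ B), np.trace(B @ A)))            # True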

  18. Rank of matrix • Equal to the size of the largest square sub-matrix of A that has a non-zero determinant • Example: a matrix whose largest square sub-matrix with non-zero determinant is 3 x 3 has rank 3

  19. Rank of matrix (cont’d) • Alternative definition: the maximum number of linearly independent columns (or rows) of A • Example: if one row (or column) of a 4 x 4 matrix is a linear combination of the others, the rank is not 4 (a numeric check is sketched below)
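
A numeric illustration (the matrix entries are assumed, not the slide's example) of a 4 x 4 matrix with rank 3:

    import numpy as np

    A = np.array([[1.0, 2.0, 0.0, 3.0],
                  [0.0, 1.0, 1.0, 1.0],
                  [0.0, 0.0, 2.0, 1.0],
                  [1.0, 3.0, 1.0, 4.0]])     # row 4 = row 1 + row 2, so the rows are dependent

    print(np.linalg.matrix_rank(A))           # 3, not 4
    print(np.isclose(np.linalg.det(A), 0.0))  # True: the full 4 x 4 determinant is zero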

  20. Eigenvalues and Eigenvectors • The vector v is an eigenvector of matrix A and λ is an eigenvalue of A if: Av = λv (assume non-zero v) • Interpretation: the linear transformation implied by A cannot change the direction of the eigenvectors v, only their magnitude

  21. Computing λ and v • To find the eigenvalues λ of a matrix A, find the roots of the characteristic polynomial: det(A - λI) = 0 • To find the eigenvectors v, solve (A - λI)v = 0 for each eigenvalue λ • Example: see the numeric sketch below
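
A hedged NumPy sketch (the 2 x 2 matrix is an assumed example, not the one from the slide) of finding eigenvalues as roots of the characteristic polynomial and checking Av = λv:

    import numpy as np

    A = np.array([[2.0, 1.0],
                  [1.0, 2.0]])

    # Roots of the characteristic polynomial det(lambda*I - A) = 0
    coeffs = np.poly(A)            # characteristic polynomial coefficients
    print(np.roots(coeffs))        # [3. 1.]

    # Direct computation of eigenvalues and eigenvectors
    vals, vecs = np.linalg.eig(A)
    print(vals)                    # e.g. [3. 1.] (order may vary)
    print(np.allclose(A @ vecs[:, 0], vals[0] * vecs[:, 0]))   # True: A v = lambda v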

  22. Properties • Eigenvalues and eigenvectors are only defined for square matrices (i.e., m = n) • Eigenvectors are not unique (e.g., if v is an eigenvector, so is kv for any k ≠ 0) • Suppose λ1, λ2, ..., λn are the eigenvalues of A; then: det(A) = λ1λ2 · · · λn and tr(A) = λ1 + λ2 + . . . + λn
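
A brief NumPy verification of these two properties for an assumed 3 x 3 matrix:

    import numpy as np

    A = np.array([[4.0, 1.0, 0.0],
                  [1.0, 3.0, 1.0],
                  [0.0, 1.0, 2.0]])
    vals = np.linalg.eigvals(A)

    # det(A) = product of eigenvalues, tr(A) = sum of eigenvalues
    print(np.isclose(np.prod(vals), np.linalg.det(A)))   # True
    print(np.isclose(np.sum(vals), np.trace(A)))          # True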

  23. Properties (cont’d) • If A has n distinct eigenvalues λ1, λ2, ..., λn, then the corresponding eigenvectors v1, v2, . . . , vn form a basis: (1) they are linearly independent and (2) they span Rn

  24. Matrix diagonalization • Given A, find P such that P-1AP is diagonal (i.e., P diagonalizes A) • Take P = [v1 v2 . . . vn], where v1, v2, . . . , vn are the eigenvectors of A; then P-1AP = D, the diagonal matrix with the eigenvalues λ1, λ2, . . . , λn on its diagonal

  25. Matrix diagonalization (cont’d) Example:
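
The worked example from the original slide is not reproduced here; a small assumed example (NumPy) of the same procedure:

    import numpy as np

    A = np.array([[2.0, 1.0],
                  [1.0, 2.0]])
    vals, P = np.linalg.eig(A)              # columns of P are eigenvectors of A

    D = np.linalg.inv(P) @ A @ P            # P^-1 A P
    print(np.allclose(D, np.diag(vals)))    # True: D is diagonal with the eigenvalues of A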

  26. Matrix decomposition • Let us assume that A is diagonalizable; then: A = PDP-1, where D = diag(λ1, λ2, . . . , λn) and the columns of P are the corresponding eigenvectors of A

  27. Decomposition of symmetric matrices • The eigenvalues of symmetric matrices are all real • The eigenvectors corresponding to distinct eigenvalues are orthogonal, so P can be chosen orthogonal: P-1 = PT • Therefore A = PDPT = λ1v1v1T + λ2v2v2T + . . . + λnvnvnT
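
A short NumPy check of this decomposition for an assumed 2 x 2 symmetric matrix:

    import numpy as np

    A = np.array([[3.0, 1.0],
                  [1.0, 3.0]])               # symmetric: A = A^T

    vals, P = np.linalg.eigh(A)              # eigh: real eigenvalues, orthonormal eigenvectors
    D = np.diag(vals)

    print(np.allclose(P.T, np.linalg.inv(P)))      # True: P^-1 = P^T (P is orthogonal)
    print(np.allclose(A, P @ D @ P.T))             # True: A = P D P^T

    # Equivalent rank-one expansion: A = sum_i lambda_i v_i v_i^T
    A_rebuilt = sum(vals[i] * np.outer(P[:, i], P[:, i]) for i in range(2))
    print(np.allclose(A, A_rebuilt))               # True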
