Chapter 9 Eigenvalue, Diagonalization, and Special Matrices

<Definition>
A real or complex number $\lambda$ is an eigenvalue of A if there is a nonzero $n \times 1$ matrix (vector) E such that $AE = \lambda E$. Any nonzero vector E satisfying this relationship is called an eigenvector associated with the eigenvalue $\lambda$.

<Definition> Characteristic Polynomial
The polynomial $|\lambda I_n - A|$ is the characteristic polynomial of A, and is denoted $p_A(\lambda)$.

<Theorem>
• Let A be an $n \times n$ matrix of real or complex numbers. Then
• $\lambda$ is an eigenvalue of A if and only if $p_A(\lambda) = 0$.
• If $\lambda$ is an eigenvalue of A, then any non-trivial solution of $(\lambda I_n - A)E = 0$ is an associated eigenvector.
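As a quick numeric illustration of this theorem (a minimal sketch only, assuming NumPy, which is not part of these notes): the roots of the characteristic polynomial $p_A(\lambda) = |\lambda I_n - A|$ coincide with the eigenvalues computed directly.

```python
# Sketch (assumes NumPy): roots of the characteristic polynomial = eigenvalues of A.
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

coeffs = np.poly(A)              # coefficients of det(lambda*I - A): lambda^2 - 4*lambda + 3
roots = np.roots(coeffs)         # roots of p_A(lambda)
eigvals = np.linalg.eigvals(A)   # eigenvalues computed directly

print(np.sort(roots))            # [1. 3.]
print(np.sort(eigvals))          # [1. 3.]
```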
Find eigenvalues and eigenvectors for A
Step 1: Use $p_A(\lambda) = |\lambda I_n - A| = 0$ to find the eigenvalues $\lambda_1, \lambda_2, \ldots, \lambda_n$.
Step 2: For each $\lambda_j$, use $(\lambda_j I_n - A)E = 0$ to find a nontrivial solution $E_j$; $E_j$ is the eigenvector associated with the eigenvalue $\lambda_j$. (A numeric sketch of these two steps follows the next definition.)
Examples:

<Definition> Diagonal Matrix
$D = \mathrm{diag}(d_1, d_2, \ldots, d_n)$: $d_1, d_2, \ldots, d_n$ are the main diagonal elements, and all off-diagonal elements are 0.
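The following is a small worked sketch of Steps 1 and 2 above (assuming NumPy; the matrix A here is only an illustrative example, not from the notes): each eigenvalue is found first, then a nontrivial solution of $(\lambda I_n - A)E = 0$ gives an associated eigenvector.

```python
# Sketch of Steps 1-2 (assumes NumPy): find eigenvalues, then solve (lambda*I - A)E = 0.
import numpy as np

A = np.array([[4.0, 2.0],
              [1.0, 3.0]])

# Step 1: the eigenvalues of A (roots of the characteristic polynomial).
eigvals = np.linalg.eigvals(A)            # [5. 2.]

# Step 2: for each eigenvalue, take a nontrivial null-space vector of lambda*I - A.
for lam in eigvals:
    M = lam * np.eye(2) - A
    _, _, vt = np.linalg.svd(M)
    E = vt[-1]                             # right singular vector for the zero singular value
    print(lam, E, np.allclose(A @ E, lam * E))   # A E = lambda E holds
```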
Theorem
Let D be the $n \times n$ diagonal matrix with main diagonal elements $d_1, d_2, \ldots, d_n$. Then
1.
2.
3. D is nonsingular if and only if each main diagonal element is nonzero.
4. If each $d_j \neq 0$, then $D^{-1} = \mathrm{diag}(1/d_1, 1/d_2, \ldots, 1/d_n)$.
5. The eigenvalues of D are its main diagonal elements.
6. An eigenvector associated with $d_j$ is $e_j$, row j of $I_n$ written as a column vector.
(A numeric check of parts 3-6 appears after the next definition.)

<Definition> Diagonalizable Matrix
An $n \times n$ matrix A is diagonalizable if there exists an $n \times n$ matrix P such that $P^{-1}AP$ is a diagonal matrix. When P exists, we say that P diagonalizes A.
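Below is the numeric check of parts 3-6 of the diagonal matrix theorem referred to above (a sketch, assuming NumPy): since every $d_j \neq 0$, the inverse is $\mathrm{diag}(1/d_1, \ldots, 1/d_n)$, the eigenvalues are the diagonal elements, and $e_j$ is an eigenvector for $d_j$.

```python
# Sketch (assumes NumPy): properties 3-6 of a diagonal matrix D = diag(d_1, ..., d_n).
import numpy as np

d = np.array([2.0, -3.0, 5.0])
D = np.diag(d)

# 4. Every d_j != 0, so D^{-1} = diag(1/d_1, ..., 1/d_n).
print(np.allclose(np.linalg.inv(D), np.diag(1.0 / d)))   # True

# 5. and 6. The eigenvalues are the main diagonal elements, with e_j an eigenvector for d_j.
for j in range(3):
    e_j = np.eye(3)[:, j]
    print(np.allclose(D @ e_j, d[j] * e_j))               # True for each j
```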
Theorem
A is diagonalizable if it has n linearly independent eigenvectors. Further, if P is the $n \times n$ matrix having the eigenvectors as columns, then $P^{-1}AP$ is the diagonal matrix having the corresponding eigenvalues down its main diagonal.
Steps:
1. From the $n \times n$ matrix A, find its eigenvalues $\lambda_1, \ldots, \lambda_n$ and the associated eigenvectors $V_1, \ldots, V_n$.
2. Use the eigenvectors as the columns of P, i.e., $P = [V_1 \; V_2 \; \cdots \; V_n]$.
3. Compute the inverse of P, i.e., $P^{-1}$.
4. Check that P diagonalizes A, i.e., $P^{-1}AP = \mathrm{diag}(\lambda_1, \ldots, \lambda_n)$.
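Steps 1-4 can be verified numerically; the block below is a minimal sketch (assuming NumPy, with an example matrix A not taken from the notes).

```python
# Sketch of Steps 1-4 (assumes NumPy): build P from eigenvectors and check P^{-1} A P.
import numpy as np

A = np.array([[4.0, 2.0],
              [1.0, 3.0]])

# Step 1: eigenvalues and eigenvectors (eigenvectors are the columns of V).
eigvals, V = np.linalg.eig(A)

# Step 2: use the eigenvectors as the columns of P.
P = V

# Step 3: compute P^{-1} (it exists here because the eigenvectors are independent).
P_inv = np.linalg.inv(P)

# Step 4: check that P diagonalizes A.
print(np.allclose(P_inv @ A @ P, np.diag(eigvals)))   # True
```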
Theorem
If an $n \times n$ matrix A has n linearly independent eigenvectors $V_1, \ldots, V_n$, then A is diagonalizable: with $P = [V_1 \; \cdots \; V_n]$, the inverse $P^{-1}$ exists and $P^{-1}AP$ is diagonal. (Because all the columns of P are linearly independent, no column can be replaced by a combination of the others, so the reduced matrix of P is $I_n$ and the inverse of P exists.)

Theorem
If A is diagonalizable, then $P^{-1}$ exists and A has n linearly independent eigenvectors.
Theorem
Let A be an $n \times n$ diagonalizable matrix. Then A has n linearly independent eigenvectors. Further, if $Q^{-1}AQ = D$ is a diagonal matrix, then the diagonal elements of D are the eigenvalues of A, and the columns of Q are corresponding eigenvectors.
Proof. Examples:

Theorem
Let the $n \times n$ matrix A have n distinct eigenvalues. Then the corresponding eigenvectors are linearly independent.
Proof. Examples:
• Distinct eigenvalues (n eigenvalues, all with different values)
• ⇒ the eigenvectors are linearly independent
• ⇒ $P^{-1}$ exists ⇒ A is diagonalizable.
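A small numeric check of the last theorem (a sketch, assuming NumPy; the triangular matrix is only an example): distinct eigenvalues give linearly independent eigenvectors, so the eigenvector matrix is nonsingular and diagonalizes A.

```python
# Sketch (assumes NumPy): distinct eigenvalues => independent eigenvectors => diagonalizable.
import numpy as np

A = np.array([[1.0, 2.0, 0.0],
              [0.0, 3.0, 1.0],
              [0.0, 0.0, -2.0]])      # triangular, so the eigenvalues 1, 3, -2 are distinct

eigvals, Q = np.linalg.eig(A)
print(np.unique(np.round(eigvals, 8)).size == 3)                  # eigenvalues are distinct
print(np.linalg.matrix_rank(Q) == 3)                              # eigenvector columns independent
print(np.allclose(np.linalg.inv(Q) @ A @ Q, np.diag(eigvals)))    # Q diagonalizes A
```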
Orthogonal and Symmetric Matrices

<Definition> Orthogonal Matrix
A square matrix A is orthogonal if and only if $AA^t = A^tA = I_n$; that is, A and $A^t$ are inverses of each other ($A^{-1} = A^t$), and $A^t$ is also an orthogonal matrix.

Theorem
A is an orthogonal matrix if and only if $A^t$ is an orthogonal matrix.

Theorem
If A is an orthogonal matrix, then $\det(A) = \pm 1$.
Proof:
Theorem
• Let A be a real $n \times n$ matrix. Then
• A is orthogonal if and only if the row vectors form an orthonormal set (mutually orthogonal unit vectors) in $R^n$.
• A is orthogonal if and only if the column vectors form an orthonormal set (mutually orthogonal unit vectors) in $R^n$.
• Proof.

Example: 2×2 Orthogonal Matrix
For a rotation through the angle $\theta$, the rotation matrix
$A = \begin{pmatrix} \cos\theta & -\sin\theta \\ \sin\theta & \cos\theta \end{pmatrix}$
is orthogonal.
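The rotation matrix example can be verified directly; the block below is a brief sketch (assuming NumPy) that checks $AA^t = A^tA = I_n$ and $\det(A) = \pm 1$.

```python
# Sketch (assumes NumPy): a rotation matrix is orthogonal and has determinant +-1.
import numpy as np

theta = 0.7
A = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

print(np.allclose(A @ A.T, np.eye(2)))          # rows are orthonormal: A A^t = I_n
print(np.allclose(A.T @ A, np.eye(2)))          # columns are orthonormal: A^t A = I_n
print(np.isclose(abs(np.linalg.det(A)), 1.0))   # |det(A)| = 1
```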
<Definition> Symmetric Matrix
A square matrix A is symmetric if $A^t = A$.
Example:

Theorem
The eigenvalues of a real, symmetric matrix are real numbers.
Proof:

Theorem
Let A be a real symmetric matrix. Then eigenvectors associated with distinct eigenvalues are orthogonal.
Proof: Example:

Theorem
Let A be a real symmetric matrix. Then there is a real, orthogonal matrix that diagonalizes A.
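The three theorems on real symmetric matrices can be seen numerically; the sketch below assumes NumPy (`numpy.linalg.eigh` is its routine for symmetric/hermitian matrices) and shows real eigenvalues together with an orthogonal diagonalizing matrix.

```python
# Sketch (assumes NumPy): a real symmetric matrix is diagonalized by a real orthogonal matrix.
import numpy as np

A = np.array([[2.0, 1.0, 0.0],
              [1.0, 2.0, 1.0],
              [0.0, 1.0, 2.0]])                    # A^t = A

eigvals, Q = np.linalg.eigh(A)                     # real eigenvalues, orthonormal eigenvectors
print(np.isrealobj(eigvals))                       # True: eigenvalues are real
print(np.allclose(Q.T @ Q, np.eye(3)))             # True: Q is orthogonal
print(np.allclose(Q.T @ A @ Q, np.diag(eigvals)))  # True: Q^t A Q is diagonal
```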
Unitary, Hermitian, and Skew-Hermitian Matrices

<Definition>
● An $n \times n$ complex matrix U is unitary if and only if $\overline{U}^t U = U\overline{U}^t = I_n$ (that is, $U^{-1} = \overline{U}^t$).
● An $n \times n$ complex matrix H is hermitian if and only if $\overline{H}^t = H$.
● An $n \times n$ complex matrix S is skew-hermitian if and only if $\overline{S}^t = -S$.
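These three definitions can be checked numerically for small complex matrices; the block below is a sketch (assuming NumPy, with example matrices not taken from the notes), writing the conjugate transpose as `.conj().T`.

```python
# Sketch (assumes NumPy): unitary, hermitian, and skew-hermitian checks.
import numpy as np

U = np.array([[1, 1j],
              [1j, 1]]) / np.sqrt(2)        # unitary: conj(U)^t U = I_n
H = np.array([[2, 1 - 1j],
              [1 + 1j, 3]])                 # hermitian: conj(H)^t = H
S = np.array([[1j, 1 + 1j],
              [-1 + 1j, -2j]])              # skew-hermitian: conj(S)^t = -S

print(np.allclose(U.conj().T @ U, np.eye(2)))   # True
print(np.allclose(H.conj().T, H))               # True
print(np.allclose(S.conj().T, -S))              # True
```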