
Presentation Transcript


  1. Chapter 1 Section 1.7 Linear Independence and Nonsingular Matrices

  2. Zero Vector in $\mathbb{R}^n$: A vector in $\mathbb{R}^n$ in which every entry is zero is called the zero vector; it is denoted $\mathbf{0}$ (or $\mathbf{0}_n$). Linearly Independent Vectors: A set of vectors $\{\mathbf{v}_1, \mathbf{v}_2, \ldots, \mathbf{v}_k\}$ in $\mathbb{R}^n$ is said to be linearly independent if the only solution to the homogeneous system of equations given by $x_1\mathbf{v}_1 + x_2\mathbf{v}_2 + \cdots + x_k\mathbf{v}_k = \mathbf{0}$ is $x_1 = x_2 = \cdots = x_k = 0$. Vectors that are not linearly independent (i.e., it is possible to find a nonzero solution) are said to be linearly dependent. Recall that a homogeneous system of equations always has the zero solution, but it will have infinitely many solutions (hence a nonzero solution) if one of the variables is an independent variable. If an independent variable exists the vectors are linearly dependent, and if the system has no independent variable the set is linearly independent. If a set of vectors is linearly dependent, it is possible to express one of the vectors as a linear combination of the others (i.e., one of the vectors gives you no new algebraic information).
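  As a concrete illustration of the definition (a minimal sketch added here, not part of the original slides; the vectors are made up), the independence test amounts to asking whether the homogeneous system has only the zero solution:

```python
import numpy as np

# Three candidate vectors in R^3 (made-up values for illustration).
v1 = np.array([1.0, 0.0, 2.0])
v2 = np.array([0.0, 1.0, 1.0])
v3 = np.array([1.0, 1.0, 3.0])   # v3 = v1 + v2, so the set is dependent

A = np.column_stack([v1, v2, v3])  # columns of A are the vectors

# Independent iff the only solution of A x = 0 is x = 0, i.e. iff
# rank(A) equals the number of vectors (no independent/free variable).
rank = np.linalg.matrix_rank(A)
print("independent" if rank == A.shape[1] else "dependent")  # -> dependent
```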

  3. Example: Determine whether the following sets of vectors are linearly dependent or independent; if they are linearly dependent, show how one is a linear combination of the others. (The specific vectors appeared as images on the original slide.) First set: form the matrix whose columns are the vectors, form the corresponding homogeneous system, and row reduce. The resulting system has 3 dependent variables and no independent variables (i.e., $x_1 = x_2 = x_3 = 0$ is the only solution), therefore the set is linearly independent. Second set: form the system and its matrix, and row reduce to find the general solution. This system has independent variables, so the vectors are linearly dependent. To write one vector in terms of the others, pick a particular nonzero solution.
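  The row-reduction workflow on this slide can be checked mechanically. Below is a short Python sketch using sympy; the vectors are invented stand-ins, since the slide's actual numbers were images:

```python
import sympy as sp

# Hypothetical set of vectors (the slide's actual numbers were images).
v1 = sp.Matrix([1, 2, 3])
v2 = sp.Matrix([2, 4, 6])   # v2 = 2*v1, which forces a dependence
v3 = sp.Matrix([0, 1, 1])

A = sp.Matrix.hstack(v1, v2, v3)
R, pivots = A.rref()
print(R)       # the reduced row echelon form
print(pivots)  # pivot columns = dependent variables; the others are free

# Any nonzero solution of A x = 0 is a dependence relation
# x1*v1 + x2*v2 + x3*v3 = 0.
for sol in A.nullspace():
    print(sol.T)   # Matrix([[-2, 1, 0]]), i.e. v2 = 2*v1
```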

  4. Without doing any calculations, why is the set of vectors to the right linearly dependent? The set contains the zero vector. If the zero vector is in the set, let the variable corresponding to the zero vector be 1 and all the others be 0; this gives a nonzero solution to the system. Without doing any calculations, why is the second set of vectors to the right linearly dependent? The corresponding system of equations can have at most 3 dependent variables, so there must be at least 1 independent variable. Whenever you have more variables (vectors) than equations (entries in a vector), the set must be linearly dependent. It is useful to notice these details in order to avoid either a great deal of calculation by hand or a bunch of unnecessary typing into your calculator or computer!
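  A quick sketch of the second observation (with arbitrary made-up entries): four vectors in $\mathbb{R}^3$ can never be independent, because the rank of the resulting $3 \times 4$ matrix is at most 3:

```python
import numpy as np

# Four vectors in R^3 with arbitrary entries: more vectors (columns)
# than entries per vector (rows), so dependence is forced.
A = np.random.default_rng(0).integers(-5, 6, size=(3, 4)).astype(float)

rank = np.linalg.matrix_rank(A)   # at most 3 for a 3x4 matrix
print(rank < A.shape[1])          # True -> at least one free variable
```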

  5. Unit Vectors in $\mathbb{R}^n$: The unit vectors in $\mathbb{R}^n$ are $\mathbf{e}_1, \mathbf{e}_2, \ldots, \mathbf{e}_n$, where $\mathbf{e}_i$ is a vector with $n$ entries, with a 1 in the $i$th position and 0 for all the other entries. Nonsingular and Singular Matrices: A square matrix $A$ is called nonsingular if the only solution to the matrix equation $A\mathbf{x} = \mathbf{0}$ is $\mathbf{x} = \mathbf{0}$ (i.e., $\mathbf{x}$ is the zero vector); in short, if $A\mathbf{x} = \mathbf{0}$ then $\mathbf{x} = \mathbf{0}$. If a nonzero solution exists for $\mathbf{x}$, we call the matrix $A$ singular. Again, this represents a homogeneous system of equations, and the only way to have a nonzero solution is to have an independent variable in the general solution. One example matrix on the slide row reduces to the identity matrix and is therefore nonsingular; another row reduces to a matrix with a row of all zeros and is therefore singular (the matrices appeared as images). In general: if a square matrix can be row reduced to the identity matrix, the matrix is nonsingular; if the row reduced matrix has a row of all zeros, it is singular.
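  The rref test for singularity is easy to run by machine. A small sketch with made-up matrices (not the ones pictured on the slide):

```python
import sympy as sp

A = sp.Matrix([[1, 2], [3, 4]])   # made-up example matrix
B = sp.Matrix([[1, 2], [2, 4]])   # second row is twice the first

# Nonsingular iff the matrix row reduces to the identity;
# a row of all zeros in the rref means singular.
print(A.rref()[0])   # Matrix([[1, 0], [0, 1]])  -> nonsingular
print(B.rref()[0])   # Matrix([[1, 2], [0, 0]])  -> singular
```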

  6. Nonsingular Matrices and Linear Independence: A square matrix $A$ is nonsingular if and only if the set of column vectors of $A$ forms a linearly independent set. Example: Find all values of $a$ so that the given set of vectors is linearly dependent (the vectors appeared as images on the slide). We begin by forming the matrix with the given set of vectors as columns and row reducing it: swap $R_1 \leftrightarrow R_2$, scale by $\tfrac{1}{2}R_1$, then apply $-aR_1 + R_2$. There will be an independent variable for this system of equations if $R_2$ is all zero, so set the expression in the second row and second column equal to zero and solve. For either of the two resulting values of $a$, the vectors will be linearly dependent.
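  To see how such a parameter problem can be solved mechanically, here is a sketch with a hypothetical parametrized matrix (the slide's actual vectors were images); it uses the determinant, which vanishes exactly when the row reduction produces the zero second row:

```python
import sympy as sp

a = sp.symbols('a')

# Hypothetical parametrized matrix of column vectors (the slide's actual
# vectors were images). Dependent exactly when the matrix is singular.
A = sp.Matrix([[2, a],
               [a, 2]])

# Singular iff det(A) = 0, which is the same condition as the second
# row of the row reduction becoming all zeros.
print(sp.solve(sp.det(A), a))   # [-2, 2] -> dependent iff a = -2 or a = 2
```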

  7. Example: Determine a relation between $a$, $b$, $c$, and $d$ so that the matrix $\begin{pmatrix} a & b \\ c & d \end{pmatrix}$ is singular. Again we want to row reduce, but the sequence is not quite the Gauss-Jordan method since we do not know the values of the entries: scale by $cR_1$ to get $(ac,\ bc)$, scale by $aR_2$ to get $(ac,\ ad)$, then apply $-R_1 + R_2$ to get $(0,\ ad - bc)$. (Strictly, scaling by $c$ and $a$ assumes they are nonzero, but the final criterion holds in all cases.) This system will have an independent variable as long as the second row is all zero, so set the entry in the second row and second column equal to zero: $ad - bc = 0$. From this we have characterized all singular $2 \times 2$ matrices: a matrix of the form $\begin{pmatrix} a & b \\ c & d \end{pmatrix}$ is singular if and only if $ad - bc = 0$ and nonsingular if and only if $ad - bc \neq 0$.
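  The criterion is easy to verify numerically. A tiny sketch with an illustrative helper function (the name is my own, not from the slides):

```python
import numpy as np

def is_singular_2x2(a, b, c, d, tol=1e-12):
    """Illustrative helper: singular exactly when ad - bc = 0."""
    return abs(a * d - b * c) < tol

print(is_singular_2x2(1, 2, 2, 4))   # True:  1*4 - 2*2 = 0
print(is_singular_2x2(1, 2, 3, 4))   # False: 1*4 - 2*3 = -2

# Cross-check against numpy's determinant:
print(np.isclose(np.linalg.det(np.array([[1., 2.], [2., 4.]])), 0.0))  # True
```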
