
3.8 Inner Product Spaces



  1. 3.8 Inner Product Spaces • Euclidean n-space: Rn was defined to be the set of all ordered n-tuples of real numbers. When Rn is combined with the standard operations of vector addition, scalar multiplication, vector length, and the dot product, the resulting vector space is called Euclidean n-space. The dot product of two vectors u = (u1, u2, ..., un) and v = (v1, v2, ..., vn) is defined to be u · v = u1v1 + u2v2 + ... + unvn. The definitions of the vector length and the dot product are needed to provide the metric concepts (length, distance, angle) for the vector space.
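
A quick illustration of the dot product and the induced length (a minimal NumPy sketch; the vectors are sample data, not from the slides):

```python
import numpy as np

u = np.array([1.0, 2.0, 3.0])   # sample vectors in R^3
v = np.array([4.0, 0.0, -1.0])

dot = u @ v                     # dot product u . v = u1*v1 + u2*v2 + u3*v3
length = np.sqrt(u @ u)         # vector length ||u|| = sqrt(u . u)
print(dot, length)              # 1.0 3.7416...
```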

  2. Axioms of inner product: Let u, v, and w be vectors in a vector space V, and let c be any scalar. An inner product on V is a function that associates a real number <u, v> with each pair of vectors u and v and satisfies the following axioms. (1) <u, v> = <v, u> (2) <u, v + w> = <u, v> + <u, w> (3) c<u, v> = <cu, v> (4) <v, v> ≥ 0, and <v, v> = 0 if and only if v = 0

  3. Note: A vector space V with an inner product is called an inner product space. Vector space: (V, vector addition, scalar multiplication). Inner product space: (V, vector addition, scalar multiplication, <u, v>). • Note: The dot product <u, v> = u · v is the Euclidean inner product on Rn; it is only one example of an inner product.

  4. Ex: (A different inner product for Rn) Show that the function <u, v> = u1v1 + 2u2v2 defines an inner product on R2, where u = (u1, u2) and v = (v1, v2). Sol: Axioms 1-3 follow from the commutativity and distributivity of real arithmetic, e.g. <u, v> = u1v1 + 2u2v2 = v1u1 + 2v2u2 = <v, u>. For axiom 4, <v, v> = v1^2 + 2v2^2 ≥ 0, with equality if and only if v1 = v2 = 0, that is, v = 0.
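
A numeric spot-check of the four axioms for this weighted inner product (a sketch with random test vectors; a finite check illustrates, but does not replace, the proof above):

```python
import numpy as np

def inner(u, v):
    """Weighted inner product on R^2: <u, v> = u1*v1 + 2*u2*v2."""
    return u[0] * v[0] + 2 * u[1] * v[1]

rng = np.random.default_rng(0)
for _ in range(1000):
    u, v, w = rng.normal(size=(3, 2))
    c = rng.normal()
    assert np.isclose(inner(u, v), inner(v, u))                    # axiom 1
    assert np.isclose(inner(u, v + w), inner(u, v) + inner(u, w))  # axiom 2
    assert np.isclose(c * inner(u, v), inner(c * u, v))            # axiom 3
    assert inner(u, u) >= 0                                        # axiom 4
print("All axiom spot-checks passed.")
```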

  5. Ex: (A function that is not an inner product) Show that the following function is not an inner product on R3: <u, v> = u1v1 - 2u2v2 + u3v3. Sol: Let v = (1, 2, 1). Then <v, v> = (1)(1) - 2(2)(2) + (1)(1) = -6 < 0. Axiom 4 is not satisfied. Thus this function is not an inner product on R3.


  7. For a norm, there are many possibilities. • Norm (length) of u: ||u|| = sqrt(<u, u>). If A and B are two matrices, an inner product can be <A, B> = Tr(A†B), where † is the transpose complex conjugate of the matrix and Tr means the trace. Therefore ||A|| = sqrt(<A, A>) = sqrt(Tr(A†A)).
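
A short NumPy sketch of this matrix inner product (sample matrix assumed); sqrt(Tr(A†A)) is exactly the Frobenius norm, so np.linalg.norm gives an independent check:

```python
import numpy as np

def mat_inner(A, B):
    """Matrix inner product <A, B> = Tr(A† B), with † the conjugate transpose."""
    return np.trace(A.conj().T @ B)

A = np.array([[1.0, 2.0], [3.0, 4.0]])
norm_A = np.sqrt(mat_inner(A, A).real)   # ||A|| = sqrt(Tr(A†A))
print(norm_A, np.linalg.norm(A))         # both give the Frobenius norm 5.4772...
```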

  8. For a norm, there are many possibilities. • There is an example in criminal law in which the distinctions between some of these norms have very practical consequences. If you’re caught selling drugs in New York, there is a longer sentence if your sale is within 1000 feet of a school. If you are an attorney defending someone accused of this crime, which of the norms would you argue for? The legislators didn’t know linear algebra, so they didn’t specify which norm they intended. The prosecuting attorney argued for norm #1 (the Euclidean norm), “as the crow flies.” The defense argued that “crows don’t sell drugs” and humans move along city streets, so norm #2 (the taxicab norm) is more appropriate. • The New York Court of Appeals decided that the Pythagorean norm (#1) is the appropriate one, and rejected the pedestrian norm that the defendant advocated (#2).

  9. Orthogonal: u and v are orthogonal if <u, v> = 0. Note: If ||v|| = 1, then v is called a unit vector; for v ≠ 0, v/||v|| is the unit vector in the direction of v. • Distance between u and v: d(u, v) = ||u - v||. • Angle between two nonzero vectors u and v: cos θ = <u, v>/(||u|| ||v||), 0 ≤ θ ≤ π.

  10. Properties of norm: (1) ||u|| ≥ 0 (2) ||u|| = 0 if and only if u = 0 (3) ||cu|| = |c| ||u|| • Properties of distance: (1) d(u, v) ≥ 0 (2) d(u, v) = 0 if and only if u = v (3) d(u, v) = d(v, u)

  11. • Ex: (Finding an inner product) Show that the given function is an inner product. Sol:

  12. Thm 3.21: Let u and v be vectors in an inner product space V. (1) Cauchy-Schwarz inequality: |<u, v>| ≤ ||u|| ||v|| (2) Triangle inequality: ||u + v|| ≤ ||u|| + ||v|| (3) Pythagorean theorem: u and v are orthogonal if and only if ||u + v||^2 = ||u||^2 + ||v||^2
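
The two inequalities are easy to spot-check numerically (a sketch with random vectors, assuming the Euclidean inner product):

```python
import numpy as np

rng = np.random.default_rng(1)
for _ in range(1000):
    u, v = rng.normal(size=(2, 3))
    # Cauchy-Schwarz: |<u, v>| <= ||u|| ||v||
    assert abs(u @ v) <= np.linalg.norm(u) * np.linalg.norm(v) + 1e-12
    # Triangle inequality: ||u + v|| <= ||u|| + ||v||
    assert np.linalg.norm(u + v) <= np.linalg.norm(u) + np.linalg.norm(v) + 1e-12
print("Cauchy-Schwarz and triangle inequality hold on all samples.")
```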

  13. • Orthogonal projections in inner product spaces: Let u and v be two vectors in an inner product space V, such that v ≠ 0. Then the orthogonal projection of u onto v is given by proj_v u = (<u, v>/<v, v>) v. Note: • We can solve for the coefficient c in proj_v u = cv by noting that u - cv must be orthogonal to v, and hence to every scalar multiple of v. Setting the inner product to zero, <u - cv, v> = <u, v> - c<v, v> = 0, giving c = <u, v>/<v, v>.

  14. Ex: (Finding an orthogonal projection in R3) Use the Euclidean inner product in R3 to find the orthogonal projection of u = (6, 2, 4) onto v = (1, 2, 0). Sol: u · v = (6)(1) + (2)(2) + (4)(0) = 10 and v · v = 1 + 4 + 0 = 5, so proj_v u = (10/5)(1, 2, 0) = (2, 4, 0).
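
The same computation in NumPy (a minimal sketch reproducing the example above):

```python
import numpy as np

def proj(u, v):
    """Orthogonal projection of u onto v: (<u, v> / <v, v>) v, for v != 0."""
    return (u @ v) / (v @ v) * v

u = np.array([6.0, 2.0, 4.0])
v = np.array([1.0, 2.0, 0.0])
p = proj(u, v)
print(p)             # [2. 4. 0.]
print((u - p) @ v)   # 0.0: the residual u - proj_v u is orthogonal to v
```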

  15. Thm 3.22: (Orthogonal projection and distance) Let u and v be two vectors in an inner product space V, such that v ≠ 0. Then for every scalar c ≠ <u, v>/<v, v>, d(u, proj_v u) < d(u, cv). (Among all scalar multiples of v, the orthogonal projection is the one closest to u.)

  16. 3.9 Orthonormal Bases: Gram-Schmidt Process • Orthogonal: A set S of vectors in an inner product space V is called an orthogonal set if every pair of distinct vectors in the set is orthogonal. • Orthonormal: An orthogonal set in which each vector is a unit vector is called orthonormal.

  17. Ex: (An orthonormal basis for R3) In R3, with the inner product <u, v> = u · v, the standard basis B = {(1, 0, 0), (0, 1, 0), (0, 0, 1)} is orthonormal. Sol: Each pair of distinct vectors in B has inner product 0, and each vector has length 1. Thus, B is an orthonormal basis for R3.

  18. Thm 3.23: (Orthogonal sets are linearly independent) If S = {v1, v2, ..., vn} is an orthogonal set of nonzero vectors in an inner product space V, then S is linearly independent. Pf: S is an orthogonal set of nonzero vectors. Suppose c1v1 + c2v2 + ... + cnvn = 0. Taking the inner product of both sides with vi gives ci<vi, vi> = 0, since all other terms vanish by orthogonality; because <vi, vi> > 0, every ci = 0. Hence S is linearly independent.

  19. Ex: (Using orthogonality to test for a basis) Show that the following set is a basis. Sol: The vectors are nonzero and mutually orthogonal, so by Thm 3.23 the set is linearly independent; since the number of vectors equals the dimension of the space, the set is a basis.

  20. Thm 3.24: (Coordinates relative to an orthonormal basis) If B = {v1, v2, ..., vn} is an orthonormal basis for an inner product space V, then the coordinate representation of a vector w with respect to B is w = <w, v1>v1 + <w, v2>v2 + ... + <w, vn>vn. Pf: B is a basis for V, so w = c1v1 + c2v2 + ... + cnvn (unique representation). Since B is orthonormal, taking the inner product of both sides with vi gives <w, vi> = ci<vi, vi> = ci.

  21. • Ex: (Representing vectors relative to an orthonormal basis) Find the coordinates of w = (5, -5, 2) relative to the following orthonormal basis B = {v1, v2, v3} for R3. Sol: By Thm 3.24, the coordinates are [w]B = (<w, v1>, <w, v2>, <w, v3>).
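
A sketch of the coordinate computation (the basis below is a hypothetical orthonormal basis for R3, chosen for illustration since the slide's basis did not survive extraction):

```python
import numpy as np

# Hypothetical orthonormal basis for R^3 (the columns of Q), for illustration only.
Q = np.array([[3/5, -4/5, 0.0],
              [4/5,  3/5, 0.0],
              [0.0,  0.0, 1.0]])

w = np.array([5.0, -5.0, 2.0])
coords = Q.T @ w    # c_i = <w, v_i> for an orthonormal basis
print(coords)       # [-1. -7.  2.]
print(Q @ coords)   # reconstructs w = c1*v1 + c2*v2 + c3*v3
```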

  22. Gram-Schmidt orthonormalization process: B = {u1, u2, ..., un} is a basis for an inner product space V. Let v1 = u1, v2 = u2 - (<u2, v1>/<v1, v1>)v1, v3 = u3 - (<u3, v1>/<v1, v1>)v1 - (<u3, v2>/<v2, v2>)v2, and in general let vk be uk minus its projections onto v1, ..., v(k-1). Then B' = {v1, v2, ..., vn} is an orthogonal basis, and B'' = {v1/||v1||, v2/||v2||, ..., vn/||vn||} is an orthonormal basis.
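
A direct implementation of the process (a sketch assuming the Euclidean inner product; this is the classical Gram-Schmidt of the slide, fine for small hand-sized examples):

```python
import numpy as np

def gram_schmidt(basis):
    """Orthonormalize linearly independent vectors (Euclidean inner product)."""
    ortho = []
    for u in basis:
        v = u.astype(float)
        for w in ortho:                    # subtract projections onto earlier v's
            v = v - (u @ w) / (w @ w) * w
        ortho.append(v)                    # v is orthogonal to all earlier v's
    return [v / np.linalg.norm(v) for v in ortho]   # normalize each vector

B = [np.array([1.0, 1.0, 0.0]),            # sample basis, not from the slides
     np.array([1.0, 2.0, 0.0]),
     np.array([0.0, 1.0, 2.0])]
for q in gram_schmidt(B):
    print(q)
```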

  23. Ex: (Applying the Gram-Schmidt orthonormalization process) Apply the Gram-Schmidt process to the following basis. Sol:

  24. Orthogonal basis Orthonormal basis

  25. Ex: Find an orthonormal basis for the solution space of the homogeneous system of linear equations. Sol:

  26. Thus one basis for the solution space is (orthogonal basis) (orthonormal basis)
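
In code, one way to get an orthonormal basis for a solution space is to take the null space from an SVD, whose rows of V^T are already orthonormal (a sketch; the matrix below is a hypothetical system, since the slide's system did not survive extraction):

```python
import numpy as np

def null_space_basis(A, tol=1e-12):
    """Orthonormal basis for the solution space of Ax = 0, via the SVD."""
    _, s, Vt = np.linalg.svd(A)
    rank = int(np.sum(s > tol))
    return Vt[rank:].T                   # columns: orthonormal null-space basis

A = np.array([[1.0, 2.0, 1.0, 0.0],      # hypothetical homogeneous system Ax = 0
              [0.0, 1.0, 1.0, 1.0]])
N = null_space_basis(A)
print(N.shape)                           # (4, 2): two orthonormal basis vectors
print(np.allclose(A @ N, 0), np.allclose(N.T @ N, np.eye(2)))
```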

  27. 3.10 Mathematical Models and Least-Squares Analysis • Orthogonal complement of W: Let W be a subspace of an inner product space V. (a) A vector u in V is said to be orthogonal to W if u is orthogonal to every vector in W. (b) The set of all vectors in V that are orthogonal to W is called the orthogonal complement of W, written W⊥ (read “W perp”).

  28. Thm 3.25: (Properties of orthogonal subspaces) Let W be a subspace of Rn. Then the following properties are true. (1) W⊥ is a subspace of Rn. (2) W ∩ W⊥ = {0}. (3) (W⊥)⊥ = W. • Direct sum: Let W1 and W2 be two subspaces of Rn. If each vector x in Rn can be uniquely written as a sum of a vector from W1 and a vector from W2, x = w1 + w2, then Rn is the direct sum of W1 and W2, and you can write Rn = W1 ⊕ W2.

  29. Thm 3.26: (Projection onto a subspace) If {w1, w2, ..., wt} is an orthonormal basis for the subspace W of V, and v is a vector in V, then proj_W v = <v, w1>w1 + <v, w2>w2 + ... + <v, wt>wt.

  30. Ex: (Projection onto a subspace) Find the projection of the vector v onto the subspace W. Sol: First find an orthogonal basis for W, normalize it to get an orthonormal basis for W, then apply Thm 3.26.
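
A sketch of Thm 3.26 in NumPy (the subspace and vector here are hypothetical sample data):

```python
import numpy as np

def project_onto_subspace(v, W_basis):
    """proj_W v = sum of <v, w_i> w_i over an orthonormal basis {w_i} of W."""
    return sum((v @ w) * w for w in W_basis)

# Hypothetical subspace W of R^3, spanned by an orthonormal pair:
w1 = np.array([3/5, 4/5, 0.0])
w2 = np.array([0.0, 0.0, 1.0])
v = np.array([3.0, -2.0, 5.0])
p = project_onto_subspace(v, [w1, w2])
print(p)                              # [0.12 0.16 5.  ]
print((v - p) @ w1, (v - p) @ w2)     # residual is orthogonal to W
```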

  31. Fitting by Least-Squares: Scientists are often presented with a system that has no solution, and they must find an answer that is as close as possible to being an answer. Suppose we flip a coin that comes up heads some proportion m of the time: after 30, 60, and 90 flips we observe 16, 34, and 51 heads. Because of randomness, we do not find the exact proportion with this sample; the vector of experimental data (16, 34, 51) is not in the subspace of solutions {(30m, 60m, 90m)}.

  32. However, we want to find the m that most nearly works. An orthogonal projection of the data vector onto the line subspace gives our best guess: m = 7110/12600 ≈ 0.56. The estimate is a bit higher than 1/2, but not by much, so probably the penny is fair enough. The line with the slope m = 0.56 is called the line of best fit for this data. Minimizing the distance between the given vector and the vectors on the line minimizes the total of the squares of these vertical lengths. We say that the line has been obtained through fitting by least-squares.
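
The projection coefficient from the slide, computed directly (a minimal sketch using the coin data above):

```python
import numpy as np

flips = np.array([30.0, 60.0, 90.0])   # total flips
heads = np.array([16.0, 34.0, 51.0])   # observed heads

# Project the data vector onto the line {m * flips}: m = <heads, flips>/<flips, flips>
m = (heads @ flips) / (flips @ flips)
print(m)   # 7110/12600 = 0.5642..., the least-squares slope
```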

  33. The linear system of equations has no solution, but we can use orthogonal projection to find a best approximation. The different denominations of U.S. money have different average times in circulation, and we can fit a line, time = b + m·(denomination), to those data.

  34. The method of projection onto a subspace says that the coefficients b and m that make the linear combination of the columns of A as close as possible to the data vector v are the entries of x = (A^T A)^(-1) A^T v. Some calculation gives an intercept of b = 1.05 and a slope of m = 0.18.
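
A sketch of that computation (the circulation data below is assumed from the standard version of this example, not taken from the slides; it reproduces the quoted b = 1.05 and m = 0.18):

```python
import numpy as np

denom = np.array([1.0, 5.0, 10.0, 20.0, 50.0, 100.0])   # bill denomination (assumed data)
years = np.array([1.5, 2.0, 3.0, 5.0, 9.0, 20.0])       # average years in circulation

A = np.column_stack([np.ones_like(denom), denom])        # columns: intercept, slope
x = np.linalg.solve(A.T @ A, A.T @ years)                # x = (A^T A)^(-1) A^T v
print(x)                                                 # [1.05..., 0.18...]
```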

  35. Thm 3.27: (Orthogonal projection and distance) Let S be a subspace of an inner product space V, and let v be a vector in V. Then for all s in S with s ≠ proj_S v, ||v - proj_S v|| < ||v - s||. (proj_S v is the best approximation to v from S.)

  36. Pf: Write v - s = (v - proj_S v) + (proj_S v - s). The first term is orthogonal to S and the second lies in S, so by the Pythagorean theorem ||v - s||^2 = ||v - proj_S v||^2 + ||proj_S v - s||^2 > ||v - proj_S v||^2 whenever s ≠ proj_S v.

  37. Fundamental subspaces of a matrix: For a given m×n matrix A, the spaces NS(A) and RS(A) are orthogonal complements of each other within Rn. This means that any vector from NS(A) is orthogonal to any vector from CS(AT) = RS(A), and the vectors in these two spaces span Rn, i.e., Rn = NS(A) ⊕ RS(A).

  38. Thm 3.28: If A is an m×n matrix, then (1) (RS(A))⊥ = NS(A) (2) (NS(A))⊥ = RS(A) (3) (CS(A))⊥ = NS(AT) (4) (NS(AT))⊥ = CS(A)
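
A sketch that exhibits the four subspaces and checks the orthogonality relations numerically (the matrix is a hypothetical example; orthonormal bases are taken from the SVD):

```python
import numpy as np

def fundamental_subspaces(A, tol=1e-12):
    """Orthonormal bases for CS(A), NS(A^T), RS(A), NS(A) via the SVD."""
    U, s, Vt = np.linalg.svd(A)
    r = int(np.sum(s > tol))
    return U[:, :r], U[:, r:], Vt[:r].T, Vt[r:].T   # CS(A), NS(A^T), RS(A), NS(A)

A = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0]])        # rank-1 example (hypothetical)
CS, NS_AT, RS, NS = fundamental_subspaces(A)
print(np.allclose(RS.T @ NS, 0))       # RS(A) ⊥ NS(A)
print(np.allclose(CS.T @ NS_AT, 0))    # CS(A) ⊥ NS(A^T)
```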

  39. Ex: (Fundamental subspaces) Find the four fundamental subspaces of the matrix. Sol: (reduce the matrix to reduced row-echelon form)

  40. Check:

  41. • Ex: Let W be a subspace of R4. (a) Find a basis for W. (b) Find a basis for the orthogonal complement of W. Sol: (reduce the matrix to reduced row-echelon form)

  42. Notes: The set obtained above is a basis for W.

  43. Least-squares problem: Ax = b (a system of linear equations). (1) When the system is consistent, we can use Gaussian elimination with back-substitution to solve for x. (2) When the system is inconsistent, how do we find the “best possible” solution of the system, that is, the value of x for which the difference between Ax and b is as small as possible?

  44. Least-squares solution: Given a system Ax = b of m linear equations in n unknowns, the least-squares problem is to find a vector x in Rn that minimizes ||Ax - b|| with respect to the Euclidean inner product on Rm. Such a vector is called a least-squares solution of Ax = b.

  45. The least-squares solutions of Ax = b are exactly the solutions of A^T A x = A^T b (this is the solution of the normal system associated with Ax = b).

  46. Note: The problem of finding the least-squares solution of Ax = b is equal to the problem of finding an exact solution of the associated normal system A^T A x = A^T b. • Thm: If A is an m×n matrix with linearly independent column vectors, then for every m×1 matrix b, the linear system Ax = b has a unique least-squares solution. This solution is given by x = (A^T A)^(-1) A^T b. Moreover, if W is the column space of A, then the orthogonal projection of b on W is proj_W b = Ax = A(A^T A)^(-1) A^T b.
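
A sketch checking the closed form against NumPy's least-squares solver (the overdetermined system below is hypothetical):

```python
import numpy as np

A = np.array([[1.0, 1.0],
              [1.0, 2.0],
              [1.0, 3.0]])    # hypothetical 3x2 system with independent columns
b = np.array([6.0, 0.0, 0.0])

x_normal = np.linalg.solve(A.T @ A, A.T @ b)      # x = (A^T A)^(-1) A^T b
x_lstsq, *_ = np.linalg.lstsq(A, b, rcond=None)   # library least-squares solution
print(np.allclose(x_normal, x_lstsq))             # True
print(A @ x_normal)   # orthogonal projection of b on the column space of A
```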

  47. Ex: (Solving the normal equations) Find the least squares solution of the following system and find the orthogonal projection of b on the column space of A.

  48. Sol: Form the associated normal system A^T A x = A^T b.

  49. Solving the normal system gives the least-squares solution of Ax = b; then Ax gives the orthogonal projection of b on the column space of A.
