
Vectors



Presentation Transcript


  1. Vectors CS485/685 Computer Vision Dr. George Bebis

  2. n-dimensional Vector • An n-dimensional vector v is denoted as v = (x1, x2, . . . , xn)T, a column of its n components • The transpose vT = (x1, x2, . . . , xn) is the corresponding row vector

  3. Vector Normalization • Vector normalization produces a unit-length vector: v̂ = v / ||v|| • Example: v = (3, 4) has ||v|| = 5, so v̂ = (3/5, 4/5)
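The normalization step above can be sketched in NumPy (an illustrative sketch, not part of the lecture; the function name `normalize` is my own):

```python
import numpy as np

def normalize(v):
    """Return the unit-length vector pointing in the direction of v."""
    norm = np.linalg.norm(v)  # magnitude ||v||
    if norm == 0:
        raise ValueError("cannot normalize the zero vector")
    return v / norm

v = np.array([3.0, 4.0])
v_hat = normalize(v)  # (0.6, 0.8), since ||v|| = 5
```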

  4. Inner (or dot) product • Given vT = (x1, x2, . . . , xn) and wT = (y1, y2, . . . , yn), their dot product is defined as follows: v · w = x1y1 + x2y2 + . . . + xnyn (a scalar), or, in matrix form, v · w = vTw
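Both forms of the definition, the componentwise sum and the matrix product vTw, can be checked in NumPy (a sketch, not from the slides):

```python
import numpy as np

v = np.array([1.0, 2.0, 3.0])
w = np.array([4.0, 5.0, 6.0])

# Componentwise definition: x1*y1 + x2*y2 + ... + xn*yn
dot_sum = float(np.sum(v * w))

# Matrix form: v^T w (a 1xn row times an nx1 column gives a scalar)
dot_mat = float(v @ w)

# both give 1*4 + 2*5 + 3*6 = 32
```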

  5. Defining magnitude using dot product • Magnitude definition: ||v|| = sqrt(x1^2 + x2^2 + . . . + xn^2) • Dot product definition: v · v = x1^2 + x2^2 + . . . + xn^2 • Therefore: ||v|| = sqrt(v · v)

  6. Geometric definition of dot product • u · v = ||u|| ||v|| cos(θ), where θ corresponds to the smaller angle between u and v

  7. Geometric definition of dot product (cont’d) • The sign of u · v depends on cos(θ): u · v > 0 when θ < 90°, u · v = 0 when θ = 90° (u and v are orthogonal), and u · v < 0 when θ > 90°
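Solving the geometric definition for θ gives θ = arccos(u · v / (||u|| ||v||)), which can be sketched as follows (the helper `angle_between` is my own name, not from the lecture):

```python
import numpy as np

def angle_between(u, v):
    """Smaller angle theta between u and v, from u.v = ||u|| ||v|| cos(theta)."""
    cos_theta = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
    # clip guards against tiny floating-point overshoot outside [-1, 1]
    return np.arccos(np.clip(cos_theta, -1.0, 1.0))  # radians

u = np.array([1.0, 0.0])
v = np.array([0.0, 2.0])
theta = angle_between(u, v)  # pi/2: the dot product is 0, so u and v are orthogonal
```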

  8. Vector (cross) Product • The cross product w = u × v is a VECTOR! • Orientation: w is perpendicular to both u and v, with direction given by the right-hand rule • Magnitude: ||u × v|| = ||u|| ||v|| sin(θ)

  9. Vector Product Computation • With u = (u1, u2, u3) and v = (v1, v2, v3), expanding the determinant gives: u × v = (u2v3 − u3v2, u3v1 − u1v3, u1v2 − u2v1)
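The determinant expansion above translates directly into code (a sketch; NumPy's built-in `np.cross` computes the same thing):

```python
import numpy as np

def cross(u, v):
    """Cross product via the cofactor expansion of the 3x3 determinant."""
    return np.array([
        u[1] * v[2] - u[2] * v[1],
        u[2] * v[0] - u[0] * v[2],
        u[0] * v[1] - u[1] * v[0],
    ])

i = np.array([1.0, 0.0, 0.0])
j = np.array([0.0, 1.0, 0.0])
k = cross(i, j)  # (0, 0, 1): perpendicular to both inputs, right-hand rule
```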

  10. Orthogonal/Orthonormal vectors • A set of vectors x1, x2, . . . , xn is orthogonal if xi · xj = 0 for all i ≠ j • A set of vectors x1, x2, . . . , xn is orthonormal if, in addition, xi · xi = 1 for all i
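Both conditions, xi · xj = 0 for i ≠ j and xi · xi = 1, can be checked at once by comparing the matrix of pairwise dot products to the identity (a sketch; `is_orthonormal` is my own helper name):

```python
import numpy as np

def is_orthonormal(vectors, tol=1e-10):
    """Check xi . xj = 0 for i != j and xi . xi = 1, i.e. all pairwise
    dot products equal the identity matrix."""
    X = np.asarray(vectors, dtype=float)
    gram = X @ X.T  # entry (i, j) is xi . xj
    return np.allclose(gram, np.eye(len(X)), atol=tol)

basis = [np.array([1.0, 0.0, 0.0]),
         np.array([0.0, 1.0, 0.0]),
         np.array([0.0, 0.0, 1.0])]
# the standard basis i, j, k is orthonormal
```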

  11. Linear combinations of vectors • A vector v is a linear combination of the vectors v1, ..., vk: v = c1v1 + c2v2 + . . . + ckvk, where c1, ..., ck are scalars • Example: any vector in R3 can be expressed as a linear combination of the unit vectors i = (1, 0, 0), j = (0, 1, 0), and k = (0, 0, 1)

  12. Space spanning • A set of vectors S = (v1, v2, . . . , vk) spans a space W if every vector in W can be written as a linear combination of the vectors in S • Example: the vectors i, j, and k span R3

  13. Linear dependence • A set of vectors v1, ..., vk is linearly dependent if at least one of them is a linear combination of the others: vj = c1v1 + . . . + c(j-1)v(j-1) + c(j+1)v(j+1) + . . . + ckvk (i.e., vj does not appear on the right side)

  14. Linear independence • A set of vectors v1, ..., vk is linearly independent if c1v1 + c2v2 + . . . + ckvk = 0 implies c1 = c2 = . . . = ck = 0 • Example: the unit vectors i, j, and k are linearly independent
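In practice, the condition "only the trivial combination gives the zero vector" holds exactly when the matrix whose rows are the vectors has full row rank, which is easy to test numerically (a sketch under that standard equivalence; the function name is my own):

```python
import numpy as np

def linearly_independent(vectors):
    """v1..vk are independent iff c1*v1 + ... + ck*vk = 0 only for all ci = 0,
    equivalently iff the matrix with these rows has rank k."""
    X = np.asarray(vectors, dtype=float)
    return np.linalg.matrix_rank(X) == len(X)

# i, j, k are independent; (2, 4) = 2 * (1, 2) is a dependence
ind = linearly_independent([[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]])
dep = linearly_independent([[1.0, 2.0], [2.0, 4.0]])
```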

  15. Vector basis • A set of vectors (v1, ..., vk) is said to be a basis for a vector space W if (1) (v1, ..., vk) are linearly independent and (2) (v1, ..., vk) span W • Standard bases: (1, 0), (0, 1) for R2; (1, 0, 0), (0, 1, 0), (0, 0, 1) for R3; e1 = (1, 0, . . . , 0), . . . , en = (0, . . . , 0, 1) for Rn

  16. Orthogonal Basis • A basis with orthogonal basis vectors • Any set of basis vectors (x1, x2, . . . , xn) can be transformed to an orthogonal basis (o1, o2, . . . , on) using Gram-Schmidt orthogonalization
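Gram-Schmidt orthogonalization, as named above, works by subtracting from each xi its projections onto the already-computed orthogonal vectors. A minimal sketch (without the normalization step that would make the result orthonormal):

```python
import numpy as np

def gram_schmidt(vectors):
    """Transform basis vectors x1..xn into orthogonal vectors o1..on:
    oi = xi minus its projections onto o1..o(i-1)."""
    ortho = []
    for x in np.asarray(vectors, dtype=float):
        o = x.copy()
        for prev in ortho:
            # subtract the component of x along prev: ((prev.x)/(prev.prev)) * prev
            o -= (prev @ x) / (prev @ prev) * prev
        ortho.append(o)
    return ortho

o1, o2 = gram_schmidt([np.array([1.0, 1.0]), np.array([1.0, 0.0])])
# o1 = (1, 1) is kept; o2 = (0.5, -0.5) is orthogonal to it
```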

  17. Orthonormal Basis • A basis with orthonormal basis vectors, i.e., orthogonal basis vectors of unit length

  18. Uniqueness of Vector Expansion • Suppose v1, v2, . . . , vn form a basis of W; then any v ∈ W has a unique vector expansion in this basis: v = x1v1 + x2v2 + . . . + xnvn • The vector expansion provides a meaning for writing a vector as a “column of numbers”. Note: to interpret v, we need to know what basis was used for the expansion!

  19. Computing Vector Expansion (1) Assuming the basis vectors are orthogonal, to compute xi, take the inner product of vi and v: vi · v = vi · (x1v1 + . . . + xnvn) = xi(vi · vi), since vi · vj = 0 for j ≠ i (2) The coefficients of the expansion can be computed as follows: xi = (vi · v) / (vi · vi)
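The coefficient formula xi = (vi · v) / (vi · vi) can be verified numerically (a sketch with an assumed orthogonal basis (1, 1), (1, −1); `expansion_coefficients` is my own helper name):

```python
import numpy as np

def expansion_coefficients(v, basis):
    """Coefficients xi = (vi . v) / (vi . vi) for an orthogonal basis."""
    return [float((b @ v) / (b @ b)) for b in basis]

basis = [np.array([1.0, 1.0]), np.array([1.0, -1.0])]  # orthogonal, not unit length
v = np.array([3.0, 1.0])
coeffs = expansion_coefficients(v, basis)  # [2.0, 1.0]
# reconstruction: 2*(1, 1) + 1*(1, -1) = (3, 1) = v
```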
