This content provides an in-depth exploration of n-dimensional vectors, covering their definition and basic operations: normalization to unit length, the inner (dot) product and its geometric interpretation, the cross product, orthogonal and orthonormal vectors, linear combinations, space spanning, linear dependence and independence, vector bases, and transformation to an orthogonal basis via the Gram-Schmidt process. From Dr. George Bebis's CS485/685 Computer Vision course.
Vectors CS485/685 Computer Vision Dr. George Bebis
n-dimensional Vector • An n-dimensional vector v is denoted as a column of its components: v = [x1, x2, . . . , xn]^T • The transpose v^T is the row vector (x1, x2, . . . , xn)
Vector Normalization • Vector normalization rescales a vector to a unit length vector: v̂ = v / ||v|| • Example: for v = (3, 4), ||v|| = 5, so v̂ = (3/5, 4/5)
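The normalization step above can be sketched in plain Python (the function name `normalize` is an illustrative choice, not from the slides):

```python
import math

def normalize(v):
    """Rescale v to unit length by dividing each component by ||v||."""
    norm = math.sqrt(sum(x * x for x in v))
    if norm == 0:
        raise ValueError("cannot normalize the zero vector")
    return [x / norm for x in v]

# Normalizing (3, 4): ||v|| = 5, so the unit vector is (0.6, 0.8).
print(normalize([3, 4]))  # [0.6, 0.8]
```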
Inner (or dot) product • Given v^T = (x1, x2, . . . , xn) and w^T = (y1, y2, . . . , yn), their dot product is defined as follows: v · w = x1y1 + x2y2 + . . . + xnyn (a scalar), or equivalently v · w = v^T w
Defining magnitude using dot product • Magnitude definition: ||v|| = sqrt(x1^2 + x2^2 + . . . + xn^2) • Dot product definition: v · v = x1^2 + x2^2 + . . . + xn^2 • Therefore: ||v|| = sqrt(v · v)
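The two definitions above fit together directly in code; this is a minimal sketch (function names `dot` and `magnitude` are illustrative):

```python
import math

def dot(v, w):
    """Inner product: sum of componentwise products (a scalar)."""
    return sum(x * y for x, y in zip(v, w))

def magnitude(v):
    """||v|| = sqrt(v . v), using the dot-product definition above."""
    return math.sqrt(dot(v, v))

print(dot([1, 2, 3], [4, 5, 6]))  # 32
print(magnitude([3, 4]))          # 5.0
```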
Geometric definition of dot product • u · v = ||u|| ||v|| cos(θ) • θ corresponds to the smaller angle between u and v
Geometric definition of dot product (cont’d) • The sign of u · v depends on cos(θ): positive when θ < 90°, zero when θ = 90° (u and v are orthogonal), negative when θ > 90°
Vector (cross) Product • The cross product u = v × w is a VECTOR! • Orientation: u is perpendicular to both v and w, in the direction given by the right-hand rule • Magnitude: ||v × w|| = ||v|| ||w|| sin(θ)
Vector Product Computation • v × w = (v2w3 − v3w2, v3w1 − v1w3, v1w2 − v2w1)
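The componentwise formula above translates directly into code; a minimal sketch for 3-D vectors:

```python
def cross(v, w):
    """Cross product of two 3-D vectors; the result is itself a vector."""
    return [v[1] * w[2] - v[2] * w[1],
            v[2] * w[0] - v[0] * w[2],
            v[0] * w[1] - v[1] * w[0]]

# i x j = k, consistent with the right-hand rule.
print(cross([1, 0, 0], [0, 1, 0]))  # [0, 0, 1]
```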
Orthogonal/Orthonormal vectors • A set of vectors x1, x2, . . . , xn is orthogonal if xi · xj = 0 for all i ≠ j • A set of vectors x1, x2, . . . , xn is orthonormal if, in addition, every vector has unit length: xi · xj = 0 for i ≠ j and xi · xi = 1
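The two conditions can be checked mechanically; a sketch with a numerical tolerance for floating-point inputs (the helper names are illustrative):

```python
def dot(v, w):
    return sum(x * y for x, y in zip(v, w))

def is_orthogonal(vectors, tol=1e-9):
    """True if every distinct pair has zero dot product."""
    return all(abs(dot(vectors[i], vectors[j])) < tol
               for i in range(len(vectors))
               for j in range(i + 1, len(vectors)))

def is_orthonormal(vectors, tol=1e-9):
    """Orthogonal, and in addition every vector has unit length (v . v = 1)."""
    return (is_orthogonal(vectors, tol) and
            all(abs(dot(v, v) - 1.0) < tol for v in vectors))

print(is_orthonormal([[1, 0, 0], [0, 1, 0], [0, 0, 1]]))  # True
```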
Linear combinations of vectors • A vector v is a linear combination of the vectors v1, . . . , vk if v = c1v1 + c2v2 + . . . + ckvk, where c1, . . . , ck are scalars • Example: any vector in R3 can be expressed as a linear combination of the unit vectors i = (1, 0, 0), j = (0, 1, 0), and k = (0, 0, 1)
Space spanning • A set of vectors S = (v1, v2, . . . , vk) spans some space W if every vector in W can be written as a linear combination of the vectors in S • Example: the vectors i, j, and k span R3
Linear dependence • A set of vectors v1, . . . , vk is linearly dependent if at least one of them is a linear combination of the others: vj = Σ(i ≠ j) ci vi (i.e., vj does not appear on the right side)
Linear independence • A set of vectors v1, . . . , vk is linearly independent if c1v1 + c2v2 + . . . + ckvk = 0 implies c1 = c2 = . . . = ck = 0 • Example: the unit vectors i, j, and k are linearly independent
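For the special case of three 3-D vectors, independence can be tested with the scalar triple product v1 · (v2 × v3), which equals the 3×3 determinant of the matrix with these vectors as rows; this sketch (the function name `independent3` is illustrative) covers only that case, not general k vectors in Rn:

```python
def dot(v, w):
    return sum(x * y for x, y in zip(v, w))

def cross(v, w):
    return [v[1] * w[2] - v[2] * w[1],
            v[2] * w[0] - v[0] * w[2],
            v[0] * w[1] - v[1] * w[0]]

def independent3(v1, v2, v3, tol=1e-9):
    """Three 3-D vectors are linearly independent iff their scalar
    triple product v1 . (v2 x v3) (the 3x3 determinant) is nonzero."""
    return abs(dot(v1, cross(v2, v3))) > tol

print(independent3([1, 0, 0], [0, 1, 0], [0, 0, 1]))  # True
print(independent3([1, 0, 0], [0, 1, 0], [1, 1, 0]))  # False: third = first + second
```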
Vector basis • A set of vectors (v1, . . . , vk) is said to be a basis for a vector space W if (1) (v1, . . . , vk) are linearly independent and (2) (v1, . . . , vk) span W • Standard bases: R2: e1 = (1, 0), e2 = (0, 1); R3: e1 = (1, 0, 0), e2 = (0, 1, 0), e3 = (0, 0, 1); Rn: e1 = (1, 0, . . . , 0), . . . , en = (0, . . . , 0, 1)
Orthogonal Basis • A basis with orthogonal basis vectors. • Any set of basis vectors (x1, x2, . . . , xn) can be transformed to an orthogonal basis (o1, o2, . . . , on) using Gram-Schmidt orthogonalization: o1 = x1, and ok = xk − Σ(i < k) ((xk · oi) / (oi · oi)) oi
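Gram-Schmidt orthogonalization subtracts from each new vector its projections onto the vectors already produced; a minimal sketch:

```python
def dot(v, w):
    return sum(x * y for x, y in zip(v, w))

def gram_schmidt(vectors):
    """Transform basis vectors (x1, ..., xn) into an orthogonal basis:
    o_k = x_k - sum over i < k of ((x_k . o_i) / (o_i . o_i)) * o_i."""
    ortho = []
    for x in vectors:
        o = list(x)
        for prev in ortho:
            c = dot(x, prev) / dot(prev, prev)  # projection coefficient
            o = [oi - c * pi for oi, pi in zip(o, prev)]
        ortho.append(o)
    return ortho

basis = gram_schmidt([[1, 1, 0], [1, 0, 1], [0, 1, 1]])
# All pairwise dot products of the result are (numerically) zero.
```

Normalizing each `o_k` afterwards would yield an orthonormal basis.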
Orthonormal Basis • A basis with orthonormal basis vectors: orthogonal basis vectors, each of unit length.
Uniqueness of Vector Expansion • Suppose v1, v2, . . . , vn is a basis for W; then any v ∈ W has a unique vector expansion in this basis: v = x1v1 + x2v2 + . . . + xnvn • The vector expansion provides a meaning for writing a vector as a “column of numbers”. Note: to interpret v, we need to know what basis was used for the expansion!
Computing Vector Expansion (1) Assuming the basis vectors are orthogonal, to compute xi, take the inner product of vi and v: vi · v = x1(vi · v1) + . . . + xn(vi · vn) = xi(vi · vi), since vi · vj = 0 for i ≠ j (2) The coefficients of the expansion can therefore be computed as follows: xi = (vi · v) / (vi · vi)
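The coefficient formula for an orthogonal basis can be sketched directly (the function name `expansion_coeffs` is illustrative):

```python
def dot(v, w):
    return sum(x * y for x, y in zip(v, w))

def expansion_coeffs(basis, v):
    """Coefficients x_i of v in an ORTHOGONAL basis: x_i = (v_i . v) / (v_i . v_i).
    For an orthonormal basis the denominator is 1, so x_i = v_i . v."""
    return [dot(b, v) / dot(b, b) for b in basis]

# Expand v = (2, 3) in the orthogonal basis (1, 1), (1, -1):
# 2.5 * (1, 1) + (-0.5) * (1, -1) = (2, 3).
print(expansion_coeffs([[1, 1], [1, -1]], [2, 3]))  # [2.5, -0.5]
```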