
Variations






Presentation Transcript


  1. Variations

  2. Computer Vision: Stages (today: dimensionality reduction) • Image formation • Low-level: single-image processing; multiple views • Mid-level: grouping information; segmentation • High-level: estimation, recognition, classification

  3. Why model variations? Some objects have a similar basic form but some variety in contour shape, and perhaps also in pixel values.

  4. Today • Segmentation using snakes (from segmentation) • Modeling variations (PCA) • Eigenfaces and active shape models • Combining shape and pixel values (Active Appearance Models)

  5. Deformable contours a.k.a. active contours, snakes Given: initial contour (model) near desired object (single frame) [Snakes: Active contour models, Kass, Witkin, & Terzopoulos, ICCV 1987] Fig: Y. Boykov

  6. Deformable contours a.k.a. active contours, snakes Given: initial contour (model) near desired object Goal: evolve the contour to fit the exact object boundary (single frame) [Snakes: Active contour models, Kass, Witkin, & Terzopoulos, ICCV 1987] Fig: Y. Boykov

  7. Deformable contours: intuition Image from http://www.healthline.com/blogs/exercise_fitness/uploaded_images/HandBand2-795868.JPG Figure from Shapiro & Stockman

  8. Deformable contours a.k.a. active contours, snakes (figure: initial, intermediate, and final contours) • Initialize near the contour of interest • Iteratively refine: the elastic band is adjusted so as to • be near image positions with high gradients, and • satisfy shape “preferences” or contour priors Fig: Y. Boykov

  9. Deformable contours a.k.a. active contours, snakes (figure: initial, intermediate, and final contours) Like the generalized Hough transform, useful for shape fitting, but:
  Hough: fixed model shape; a single voting pass can detect multiple instances.
  Snakes: prior on shape types, but the shape is iteratively adjusted (deforms); requires initialization nearby; one optimization “pass” fits a single contour.

  10. Deformable contours a.k.a. active contours, snakes (figure: initial, intermediate, and final contours) • How is the current contour adjusted to find the new contour at each iteration? • Define a cost function (“energy” function) that says how good a possible configuration is. • Seek the next configuration that minimizes that cost function. What are examples of problems with energy functions that we have seen previously?

  11. Snakes energy function The total energy (cost) of the current snake is defined as: $E_{\text{total}} = E_{\text{internal}} + E_{\text{external}}$ Internal energy: encourages prior shape preferences, e.g. smoothness, elasticity, a particular known shape. External energy (“image” energy): encourages the contour to fit on places where image structures exist, e.g. edges. A good fit between the current deformable contour and the target shape in the image yields a low value for this cost function.

  12. Parametric curve representation (continuous case) The curve is a function $v(s) = (x(s), y(s))$, $0 \le s \le 1$. Fig from Y. Boykov

  13. Parametric curve representation (discrete form) • Represent the curve with a set of n points $v_i = (x_i, y_i)$, $i = 0, \dots, n-1$

  14. External energy: intuition • Measure how well the curve matches the image data • “Attract” the curve toward different image features • Edges, lines, etc.

  15. External image energy How do edges affect the “snap” of a rubber band? Think of the external energy from the image as a gravitational pull towards areas of high contrast. (figure: magnitude of gradient, and its negation -(magnitude of gradient))

  16. External image energy • Image $I(x, y)$ • Gradient images $G_x(x, y)$ and $G_y(x, y)$ • External energy at a point $v(s)$ on the curve: $E_{\text{ext}}(v(s)) = -\left(|G_x(v(s))|^2 + |G_y(v(s))|^2\right)$ • External energy for the whole curve: $E_{\text{external}} = \int_0^1 E_{\text{ext}}(v(s))\, ds$
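A minimal NumPy sketch of this external term in its discrete form (the name `external_energy` is my own; it assumes `points` holds integer (row, col) pixel coordinates inside the image):

```python
import numpy as np

def external_energy(image, points):
    # Gradient images (np.gradient returns axis-0 then axis-1 derivatives)
    gy, gx = np.gradient(image.astype(float))
    grad_sq = gx ** 2 + gy ** 2          # squared gradient magnitude per pixel
    r, c = points[:, 0], points[:, 1]    # integer (row, col) control points
    # Negative sign: strong edges under the curve give low (good) energy
    return -np.sum(grad_sq[r, c])
```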

  17. Internal energy: intuition A priori, we want to favor smooth shapes, contours with low curvature, contours similar to a known shape, etc. to balance what is actually observed (i.e., in the gradient image). http://www3.imperial.ac.uk/pls/portallive/docs/1/52679.JPG

  18. Internal energy For a continuous curve, a common internal energy term is the “bending energy”. At some point $v(s)$ on the curve, this is: $E_{\text{int}}(v(s)) = \alpha \left|\frac{dv}{ds}\right|^2 + \beta \left|\frac{d^2 v}{ds^2}\right|^2$ The first term penalizes elasticity (tension); the second penalizes stiffness (curvature). The more the curve bends, the larger this energy value is. The weights $\alpha$ and $\beta$ dictate how much influence each component has. Internal energy for the whole curve: $E_{\text{internal}} = \int_0^1 E_{\text{int}}(v(s))\, ds$

  19. Dealing with missing data • The smoothness constraint can deal with missing data: [Figure from Kass et al. 1987]

  20. Total energy (continuous form) $E_{\text{total}} = \int_0^1 \left( \alpha \left|\frac{dv}{ds}\right|^2 + \beta \left|\frac{d^2 v}{ds^2}\right|^2 \right) ds - \int_0^1 \left(|G_x(v(s))|^2 + |G_y(v(s))|^2\right) ds$ The first integral is the bending energy; the second is the total edge strength under the curve.

  21. Discrete energy function: external term • If the curve is represented by n points $v_i = (x_i, y_i)$: $E_{\text{external}} = -\sum_{i=0}^{n-1} \left( |G_x(x_i, y_i)|^2 + |G_y(x_i, y_i)|^2 \right)$ where $G_x$ and $G_y$ are discrete image gradients.

  22. Discrete energy function: internal term • The curve is represented by n points: $E_{\text{internal}} = \sum_{i=0}^{n-1} \left( \alpha\, |v_{i+1} - v_i|^2 + \beta\, |v_{i+1} - 2 v_i + v_{i-1}|^2 \right)$ The $\alpha$ term penalizes elasticity (tension); the $\beta$ term penalizes stiffness (curvature).
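A matching sketch for the internal term, treating the snake as a closed curve (wrap-around via `np.roll`); `internal_energy` is a hypothetical name:

```python
import numpy as np

def internal_energy(points, alpha=1.0, beta=1.0):
    v = points.astype(float)
    d1 = np.roll(v, -1, axis=0) - v                              # v_{i+1} - v_i
    d2 = np.roll(v, -1, axis=0) - 2 * v + np.roll(v, 1, axis=0)  # v_{i+1} - 2v_i + v_{i-1}
    elastic = np.sum(d1 ** 2)    # elasticity / tension term
    stiff = np.sum(d2 ** 2)      # stiffness / curvature term
    return alpha * elastic + beta * stiff
```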

  23. Penalizing elasticity • The current elastic energy definition uses a discrete estimate of the derivative, and can be re-written as: $\alpha \sum_{i=0}^{n-1} \left( (x_{i+1} - x_i)^2 + (y_{i+1} - y_i)^2 \right)$ Possible problem with this definition? It encourages a closed curve to shrink to a cluster.

  24. Penalizing elasticity • To stop the curve from shrinking to a cluster of points, we can adjust the energy function to be: $\alpha \sum_{i=0}^{n-1} \left( |v_{i+1} - v_i| - \bar{d} \right)^2$ where $\bar{d}$ is the average distance between pairs of adjacent points, updated at each iteration. • This encourages chains of equally spaced points.
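The modified elastic term could be sketched as follows (again assuming a closed curve; `elastic_energy_spaced` is my own name):

```python
import numpy as np

def elastic_energy_spaced(points, alpha=1.0):
    v = points.astype(float)
    # Lengths of the segments between consecutive points (closed curve)
    seg = np.linalg.norm(np.roll(v, -1, axis=0) - v, axis=1)
    d_bar = seg.mean()                      # updated at each iteration
    # Penalize deviation from the average spacing, not length itself,
    # so the contour no longer gains by shrinking to a cluster
    return alpha * np.sum((seg - d_bar) ** 2)
```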

  25. Function of the weights The weight controls the penalty for internal elasticity. (figure: results for small, medium, and large weights) Fig from Y. Boykov

  26. Optional: specify a shape prior • If the object is some smooth variation on a known shape, we can use a term that penalizes deviation from that shape (more about this later): $E_{\text{shape}} = \sum_{i=0}^{n-1} |v_i - \hat{v}_i|^2$ where $\hat{v}_i$ are the points of the known shape. Fig from Y. Boykov

  27. Summary: elastic snake • A simple elastic snake is defined by • A set of n points, • An internal elastic energy term • An external edge-based energy term • To use this to locate the outline of an object: • Initialize in the vicinity of the object • Modify the points to minimize the total energy How should the weights in the energy function be chosen?

  28. Energy minimization: greedy • For each point, search a window around it and move it to where the energy function is minimal • Typical window size: e.g., 5 × 5 pixels • Stop when a predefined number of points have not changed in the last iteration, or after a maximum number of iterations • Note • Convergence is not guaranteed • Needs a decent initialization
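Putting the pieces together, a greedy minimizer along these lines might look like this sketch (reusing the `external_energy` and `internal_energy` sketches above; the window size, stopping thresholds, and clipping to the image bounds are my own choices):

```python
import numpy as np

def greedy_snake(image, points, alpha=1.0, beta=1.0,
                 win=5, max_iters=100, min_moved=3):
    half = win // 2
    pts = points.copy()                      # (n, 2) integer (row, col) points
    bound = np.array(image.shape) - 1
    for _ in range(max_iters):
        moved = 0
        for i in range(len(pts)):
            best, best_e = pts[i].copy(), None
            # Search a win x win window around point i
            for dr in range(-half, half + 1):
                for dc in range(-half, half + 1):
                    cand = pts.copy()
                    cand[i] = np.clip(pts[i] + [dr, dc], 0, bound)
                    e = (internal_energy(cand, alpha, beta)
                         + external_energy(image, cand))
                    if best_e is None or e < best_e:
                        best, best_e = cand[i].copy(), e
            if not np.array_equal(best, pts[i]):
                pts[i] = best
                moved += 1
        # Stop when (almost) no points changed in the last iteration
        if moved < min_moved:
            break
    return pts
```

As the slide notes, convergence is not guaranteed and a decent initialization is needed; precomputing the image gradients once outside the loop would be an obvious optimization.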

  29. Deformable contours: tracking heart ventricles (multiple frames)

  30. Shape How to describe the shape of the human face?

  31. Face database

  32. Objective Formulation • Millions of pixels • Transform into a few parameters, e.g. man/woman, fat/skinny, etc.

  33. Key Idea Images are points in a high-dimensional space. Images in the possible set are highly correlated. So, compress them to a low-dimensional subspace that captures the key appearance characteristics of the visual DOFs. Today we will use PCA.

  34. Dimensionality Reduction The set of faces is a “subspace” of the set of images • Suppose it is K-dimensional • We can find the best subspace using PCA (see later) • This is like fitting a “hyper-plane” to the set of faces • Any face can then be written in terms of the basis vectors $u_1, \dots, u_K$: $x \approx \bar{x} + w_1 u_1 + \dots + w_K u_K$
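A minimal PCA sketch of this subspace fit (assuming `faces` is an (N, D) matrix of N flattened face images; the function name is hypothetical):

```python
import numpy as np

def fit_face_subspace(faces, K):
    mean = faces.mean(axis=0)              # the "average face"
    X = faces - mean                       # center the data
    # Rows of Vt are the principal directions, sorted by variance
    U, S, Vt = np.linalg.svd(X, full_matrices=False)
    basis = Vt[:K]                         # top-K eigenfaces, shape (K, D)
    return mean, basis
```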

  35. Eigenfaces: the idea Think of a face as being a weighted combination of some “component” or “basis” faces. These basis faces are called eigenfaces. (figure: a face decomposed into basis faces with example weights -8029, 2900, 1751, 1445, 4238, 6193, …)

  36. Eigenfaces: representing faces The basis faces can be weighted differently to represent any face. (figure: two faces with example weight vectors (-8029, 2900, 1751, 1445, 4238, 6193) and (-1183, -2088, -4336, -669, -4221, 10549))

  37. Learning the basis images Learn a set of basis faces which best represent the differences between the examples. Store each face as a set of weights for those basis faces.

  38. Eigenfaces Eigenfaces look somewhat like generic faces.

  39. Recognition & reconstruction • Representation: store a face as a set of weights • Synthesis: reconstruct a face from a set of weights • Recognition: recognise a new picture of a familiar face
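These three operations could be sketched as follows, given the `mean` and orthonormal `basis` from the PCA fit above (the names are my own):

```python
import numpy as np

def encode(face, mean, basis):
    # Representation: K weights = projection onto the eigenface basis
    return basis @ (face - mean)

def reconstruct(weights, mean, basis):
    # Synthesis: approximate face = mean + weighted sum of eigenfaces
    return mean + weights @ basis

def recognize(face, gallery_weights, mean, basis):
    # Recognition: nearest neighbour in the low-dimensional weight space,
    # where gallery_weights is an (M, K) array of known faces' weights
    w = encode(face, mean, basis)
    dists = np.linalg.norm(gallery_weights - w, axis=1)
    return int(np.argmin(dists))           # index of the best-matching face
```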

  40. Learning Variations Use Principal Component Analysis (PCA) Need to understand • What is an eigenvector • What is covariance

  41. Principal Component Analysis A sample of n observations in 2-D space. Goal: to account for the variation in a sample in as few variables as possible, to some accuracy.

  42. Subspaces Imagine that our face is simply a (high-dimensional) vector of pixels. We can think more easily about 2-D vectors. Here we have data in two dimensions, but we only really need one dimension to represent it.

  43. Finding Subspaces Suppose we take a line through the space, and then take the projection of each point onto that line. This could represent our data in “one” dimension.

  44. Finding Subspaces Some lines will represent the data well in this way, some badly. This is because the projection onto some lines separates the data well, while the projection onto others separates it badly.

  45. Finding Subspaces Rather than a line, we can perform roughly the same trick with a vector. Scale the vector to obtain any point on the line.
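A sketch of that projection trick (the example data and direction vector are my own):

```python
import numpy as np

def project_onto(points, direction):
    u = direction / np.linalg.norm(direction)   # unit vector along the line
    return points @ u                            # 1-D coordinate of each point

# Example: 2-D points reduced to one coordinate along the direction (1, 2)
X = np.array([[1.0, 2.1], [2.0, 3.9], [3.0, 6.2]])
print(project_onto(X, np.array([1.0, 2.0])))
```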

  46. Eigenvectors An eigenvector of a matrix $A$ is a vector $v$ such that: $Av = \lambda v$ where $A$ is a matrix, $v$ is a vector, and $\lambda$ is a scalar (called the eigenvalue).

  47. Example One eigenvector $v$ of $A$ satisfies $Av = 4v$, so for this eigenvector of this matrix the eigenvalue is 4. Matlab: [eigvecs, eigVals] = eigs(C);
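A NumPy equivalent of that Matlab call, checking the defining property $Av = \lambda v$ on an example matrix of my own (the slide's matrix is not shown in the transcript):

```python
import numpy as np

A = np.array([[3.0, 1.0],
              [1.0, 3.0]])            # example symmetric matrix (an assumption);
                                      # its eigenvalues happen to be 4 and 2
eigvals, eigvecs = np.linalg.eig(A)   # columns of eigvecs are the eigenvectors
for lam, v in zip(eigvals, eigvecs.T):
    print(np.allclose(A @ v, lam * v))   # True: A v equals lambda v
```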

  48. Facts about Eigenvectors • The eigenvectors of a matrix are special vectors (for a given matrix) that are only scaled by the matrix • Different matrices have different eigenvectors • Only square (but not all) matrices have eigenvectors • An N × N matrix has at most N linearly independent eigenvectors • For a symmetric matrix (such as a covariance matrix), eigenvectors with distinct eigenvalues are orthogonal (i.e., perpendicular)

  49. How to separate the 2D points?

  50. Covariance For data that have been centred around the mean, the covariance of two variables $x_i$ and $x_j$ is: $\text{cov}(x_i, x_j) = \frac{1}{N-1}\sum_{k=1}^{N} x_i^{(k)} x_j^{(k)}$ The diagonal elements of the covariance matrix are the variances, e.g. $\text{cov}(x_1, x_1) = \text{Var}(x_1)$.
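A sketch of the covariance computation on centred data (the random example data is my own):

```python
import numpy as np

data = np.random.default_rng(0).normal(size=(100, 2))  # example 2-D sample
X = data - data.mean(axis=0)          # centre each variable around its mean
C = X.T @ X / (len(X) - 1)            # (2, 2) covariance matrix
# Diagonal elements are the variances, e.g. C[0, 0] == Var(x1)
print(np.allclose(C, np.cov(data, rowvar=False)))      # True: matches np.cov
```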
