
Computing Motion from Images

  1. Computing Motion from Images Chapter 9 of S&S plus other work. MSU Fall 2013

  2. General topics • Low level change detection • Region tracking or matching over time • Interpretation of motion • MPEG compression • Interpretation of scene changes in video • Understanding human activities MSU Fall 2013

  3. Motion important to human vision MSU Fall 2013

  4. What’s moving: different cases MSU Fall 2013

  5. Image subtraction Simple method to remove unchanging background from moving regions. MSU Fall 2013
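
A minimal frame-differencing sketch of this idea in Python, assuming OpenCV and NumPy are available; the file names and the threshold of 25 grey levels are illustrative choices, not values from the lecture.

```python
import cv2
import numpy as np

# Load a background frame and the current frame as greyscale images.
background = cv2.imread("frame_background.png", cv2.IMREAD_GRAYSCALE)
current = cv2.imread("frame_current.png", cv2.IMREAD_GRAYSCALE)

# Absolute difference highlights pixels that changed between the two frames.
diff = cv2.absdiff(current, background)

# Threshold the difference into a binary change mask; the cutoff depends on
# sensor noise and lighting, so 25 is only a placeholder value.
_, change_mask = cv2.threshold(diff, 25, 255, cv2.THRESH_BINARY)

cv2.imwrite("change_mask.png", change_mask)
```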

  6. Change detection for surveillance MSU Fall 2013

  7. Change detection by image subtraction MSU Fall 2013

  8. Closing = dilation+erosion http://homepages.inf.ed.ac.uk/rbf/HIPR2/close.htm MSU Fall 2013
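
A short sketch of closing applied to the binary change mask, assuming OpenCV and NumPy; the 5x5 structuring element is an arbitrary choice.

```python
import cv2
import numpy as np

change_mask = cv2.imread("change_mask.png", cv2.IMREAD_GRAYSCALE)
kernel = np.ones((5, 5), np.uint8)

# Closing = dilation followed by erosion; it fills small holes and gaps
# inside the detected change regions.
closed = cv2.morphologyEx(change_mask, cv2.MORPH_CLOSE, kernel)

# Equivalent two-step form matching the slide's definition.
closed_explicit = cv2.erode(cv2.dilate(change_mask, kernel), kernel)
```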

  9. What to do with regions of change? • Discard small regions • Discard regions without interesting features • Keep track of regions with interesting features • Track regions into future frames using motion plus component features MSU Fall 2013
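
One way to discard the small regions is connected-components labelling on the cleaned mask; a sketch assuming OpenCV, with a hypothetical 200-pixel area cutoff.

```python
import cv2
import numpy as np

closed = cv2.imread("closed_mask.png", cv2.IMREAD_GRAYSCALE)

# Label connected change regions and gather per-region statistics
# (area, bounding box, centroid).
num_labels, labels, stats, centroids = cv2.connectedComponentsWithStats(closed)

kept = np.zeros_like(closed)
for label in range(1, num_labels):              # label 0 is the background
    if stats[label, cv2.CC_STAT_AREA] >= 200:   # drop regions smaller than 200 px
        kept[labels == label] = 255
```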

  10. Some effects of camera motion that can cause problems MSU Fall 2013

  11. Motion field MSU Fall 2013

  12. FOE and FOC We will return to the FOE (focus of expansion), the FOC (focus of contraction), and detection of panning to determine what the camera is doing in video. MSU Fall 2013

  13. Gaming using a camera to recognize the player’s motion Decathlete game MSU Fall 2013

  14. Decathlete game Cheap camera replaces the usual mouse for input. Running speed and jumping of the avatar are controlled by the detected motion of the player’s hands. MSU Fall 2013

  15. Motion detection input device Jumping (hands) Running (hands) MSU Fall 2013

  16. Motion analysis controls hurdling event (console) • Top left shows video frame of player • Middle left shows motion vectors from multiple frames • Center shows jumping patterns MSU Fall 2013

  17. Related work • Motion sensed by crude cameras • Person dances/gestures in space • System maps movement into music • Creative environment? • Good exercise room? MSU Fall 2013

  18. Computing motion vectors from corresponding “points” High energy neighborhoods are used to define points for matching MSU Fall 2013

  19. Match points between frames Such large motions are unusual. Most systems track small motions. MSU Fall 2013

  20. Requirements for interest points Match small neighborhood to small neighborhood. The previous “scene” contains several highly textured neighborhoods. MSU Fall 2013

  21. Interest = minimum directional variance Used by Hans Moravec in his robot stereo vision system. Interest points were used for stereo matching. MSU Fall 2013
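
A sketch of this Moravec-style interest measure: the interest value at a pixel is the minimum sum of squared differences over a few shift directions, so only neighborhoods textured in all directions score high. The window size and the four shift directions are choices of this sketch, not specifics from the slide.

```python
import numpy as np

def moravec_interest(img, window=3):
    """Interest = minimum directional variance (SSD) over small shifts."""
    img = img.astype(np.float64)
    half = window // 2
    h, w = img.shape
    interest = np.zeros_like(img)
    shifts = [(0, 1), (1, 0), (1, 1), (1, -1)]   # right, down, two diagonals
    for r in range(half + 1, h - half - 1):
        for c in range(half + 1, w - half - 1):
            patch = img[r - half:r + half + 1, c - half:c + half + 1]
            ssds = []
            for dr, dc in shifts:
                shifted = img[r - half + dr:r + half + 1 + dr,
                              c - half + dc:c + half + 1 + dc]
                ssds.append(np.sum((patch - shifted) ** 2))
            interest[r, c] = min(ssds)   # low if even one direction is flat
    return interest
```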

  22. Detecting interest points in I1 MSU Fall 2013

  23. Match points from I1 in I2 MSU Fall 2013

  24. Search for best match of point P1 in nearby window of I2 For both motion and stereo, we have some constraints on where to search for a matching interest point. MSU Fall 2013
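
A sketch of that search in Python, assuming we scan a bounded window of I2 around the interest point's location in I1 and keep the best sum-of-squared-differences score; the patch size, search range, and the assumption that (r, c) lies away from the I1 border are all illustrative.

```python
import numpy as np

def match_point(I1, I2, r, c, patch=7, search=15):
    """Return the (row, col) in I2 that best matches the patch around (r, c) in I1."""
    I1 = I1.astype(np.float64)
    I2 = I2.astype(np.float64)
    hp = patch // 2
    template = I1[r - hp:r + hp + 1, c - hp:c + hp + 1]
    best_ssd, best_pos = np.inf, (r, c)
    for dr in range(-search, search + 1):
        for dc in range(-search, search + 1):
            rr, cc = r + dr, c + dc
            # Skip candidates whose window falls outside I2.
            if (rr - hp < 0 or cc - hp < 0 or
                    rr + hp + 1 > I2.shape[0] or cc + hp + 1 > I2.shape[1]):
                continue
            candidate = I2[rr - hp:rr + hp + 1, cc - hp:cc + hp + 1]
            ssd = np.sum((template - candidate) ** 2)
            if ssd < best_ssd:
                best_ssd, best_pos = ssd, (rr, cc)
    return best_pos
```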

  25. Motion vectors clustered to show 3 coherent regions All motion vectors are clustered into 3 groups of similar vectors showing motion of 3 independent objects. (Dina Eldin) Motion coherence: points of same object tend to move in the same way MSU Fall 2013
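
A sketch of grouping motion vectors by similarity with k-means, assuming scikit-learn is available; the sample vectors are made up, and k = 3 simply mirrors the slide's three coherent regions.

```python
import numpy as np
from sklearn.cluster import KMeans

# Each row is one motion vector (dx, dy) computed from matched points.
vectors = np.array([[5.0, 0.1], [4.8, -0.2], [0.1, 3.9],
                    [0.0, 4.1], [-3.0, -3.1], [-2.9, -2.8]])

labels = KMeans(n_clusters=3, n_init=10).fit_predict(vectors)

# Points whose vectors fall in the same cluster are taken to belong to the
# same coherently moving object (motion coherence).
print(labels)
```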

  26. Two frames of aerial imagery Video frame N and N+1 shows slight movement: most pixels are same, just in different locations. MSU Fall 2013

  27. Can code frame N+d with displacements relative to frame N • for each 16 x 16 block in the 2nd image • find a closely matching block in the 1st image • replace the 16x16 intensities by the displacement (dX, dY) of the matching block in the 1st image • 256 bytes replaced by 2 bytes! • (If blocks differ too much, encode the differences to be added.) MSU Fall 2013
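
A minimal exhaustive-search sketch of the block coding described above. Real encoders use much faster search strategies and also encode the residual differences; the ±8-pixel search range here is only an assumption.

```python
import numpy as np

def encode_block_motion(frame_n, frame_n1, block=16, search=8):
    """Map each 16x16 block of frame N+1 to a displacement (dX, dY) into frame N."""
    frame_n = frame_n.astype(np.float64)
    frame_n1 = frame_n1.astype(np.float64)
    h, w = frame_n1.shape
    vectors = {}
    for r in range(0, h - block + 1, block):
        for c in range(0, w - block + 1, block):
            target = frame_n1[r:r + block, c:c + block]
            best_ssd, best_dxy = np.inf, (0, 0)
            for dr in range(-search, search + 1):
                for dc in range(-search, search + 1):
                    rr, cc = r + dr, c + dc
                    if rr < 0 or cc < 0 or rr + block > h or cc + block > w:
                        continue
                    candidate = frame_n[rr:rr + block, cc:cc + block]
                    ssd = np.sum((target - candidate) ** 2)
                    if ssd < best_ssd:
                        best_ssd, best_dxy = ssd, (dc, dr)   # (dX, dY)
            vectors[(r, c)] = best_dxy   # 2 small numbers instead of 256 pixel values
    return vectors
```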

  28. Frame approximation Left is original video frame N+1. Right is set of best image blocks taken from frame N. (Work of Dina Eldin) MSU Fall 2013

  29. Best matching blocks between video frames N+1 to N (motion vectors) The bulk of the vectors show the true motion of the airplane taking the pictures. The long vectors are incorrect motion vectors, but they do work well for compression of image I2! Best matches from 2nd to first image shown as vectors overlaid on the 2nd image. (Work by Dina Eldin.) MSU Fall 2013

  30. Motion coherence provides redundancy for compression MPEG “motion compensation” represents motion of 16x16 pixel blocks, NOT objects MSU Fall 2013

  31. MPEG represents blocks that move by the motion vector MSU Fall 2013

  32. MPEG has ‘I’, ‘P’, and ‘B’ frames MSU Fall 2013

  33. Computing Image Flow MSU Fall 2013

  34. MSU Fall 2013

  35. Motion Field & Optical Flow Field • Motion Field = real-world 3D motion • Optical Flow Field = projection of the motion field onto the 2D image (figure: a 3D motion vector in the scene projects to a 2D optical flow vector on the CCD image plane) Slides from Lihi Zelnik-Manor

  36. Assumptions MSU Fall 2013

  37. When does it break? (cases where optical flow ≠ motion field) • The screen is stationary yet displays motion • A fixed sphere under a changing light source • Non-rigid texture motion • Homogeneous objects generate zero optical flow Slides from Lihi Zelnik-Manor

  38. Image flow equation 1 of 2 MSU Fall 2013

  39. Image flow equation 2 of 2 MSU Fall 2013

  40. Estimating Optical Flow • Assume the image intensity is constant between time t and time t+dt MSU Fall 2013 Slides from Lihi Zelnik-Manor

  41. Brightness Constancy Equation Apply a first-order Taylor expansion, divide by dt, and simplify notation. Problem I: one equation, two unknowns MSU Fall 2013 Slides from Lihi Zelnik-Manor
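
Written out, the standard brightness constancy derivation referred to here is:

```latex
% Brightness constancy: a scene point keeps its intensity over a small dt.
\[
I(x + u\,dt,\; y + v\,dt,\; t + dt) = I(x, y, t)
\]
% First-order Taylor expansion, divide by dt, and write the partial
% derivatives as I_x, I_y, I_t:
\[
I_x\,u + I_y\,v + I_t = 0
\]
% Problem I: one linear equation in the two unknowns (u, v).
```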

  42. Problem II: “The Aperture Problem” • For points on a line of fixed intensity we can only recover the normal flow • Where did the yellow point move to between time t and time t+dt? We need additional constraints MSU Fall 2013 Slides from Lihi Zelnik-Manor
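
In symbols, the normal-flow statement on this slide (the standard result, written out here rather than taken verbatim from the slide images):

```latex
% Only the flow component along the image gradient is determined:
\[
(u, v) \cdot \frac{\nabla I}{\lVert \nabla I \rVert}
   = -\,\frac{I_t}{\lVert \nabla I \rVert},
\qquad \nabla I = (I_x, I_y).
\]
% The component perpendicular to the gradient (along the edge) is
% unconstrained: that is the aperture problem.
```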

  43. Use Local Information Sometimes enlarging the aperture can help MSU Fall 2013 Slides from Lihi Zelnik-Manor

  44. Local smoothness: Lucas-Kanade (1984) Assume constant (u,v) in a small neighborhood MSU Fall 2013 Slides from Lihi Zelnik-Manor

  45. Lucas-Kanade (1984) Goal: minimize the sum of squared brightness-constancy errors over the neighborhood. Method: least squares, giving a 2x2 linear system for the 2x1 flow vector (u,v). MSU Fall 2013 Slides from Lihi Zelnik-Manor

  46. Lucas-Kanade Solution We want the 2x2 matrix to be invertible, i.e., no zero eigenvalues MSU Fall 2013 Slides from Lihi Zelnik-Manor
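
A sketch of the Lucas-Kanade least-squares step for one window: stack the per-pixel brightness-constancy equations and solve the 2x2 normal equations. The gradient estimation and the conditioning cutoff are assumptions of this sketch.

```python
import numpy as np

def lucas_kanade_window(prev_win, next_win):
    """Estimate a single (u, v) for a small window of two consecutive frames."""
    prev_win = prev_win.astype(np.float64)
    next_win = next_win.astype(np.float64)
    Iy, Ix = np.gradient(prev_win)          # spatial derivatives
    It = next_win - prev_win                # temporal derivative

    A = np.stack([Ix.ravel(), Iy.ravel()], axis=1)   # N x 2
    b = -It.ravel()                                   # length N
    ATA = A.T @ A                                     # the 2x2 matrix on the slide
    if np.linalg.cond(ATA) > 1e6:
        return None          # near-zero eigenvalue: aperture problem, no unique flow
    u, v = np.linalg.solve(ATA, A.T @ b)
    return u, v
```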

  47. Break-downs and remedies • Brightness constancy is not satisfied → correlation-based methods • A point does not move like its neighbors (what is the ideal window size?) → regularization-based methods • The motion is not small (Taylor expansion doesn’t hold) → use multi-scale estimation MSU Fall 2013 Slides from Lihi Zelnik-Manor

  48. Multi-Scale Flow Estimation Build Gaussian pyramids of image It and image It+1 and estimate flow coarse-to-fine: a large motion at full resolution (e.g., u=10 pixels) becomes a small one at the coarsest level (u=5, 2.5, 1.25 pixels up the pyramid). MSU Fall 2013 Slides from Lihi Zelnik-Manor
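
A usage sketch of coarse-to-fine flow with OpenCV's pyramidal Lucas-Kanade tracker; the corner detector, file names, window size, and pyramid depth are illustrative parameters.

```python
import cv2
import numpy as np

prev_gray = cv2.imread("frame_t.png", cv2.IMREAD_GRAYSCALE)
next_gray = cv2.imread("frame_t1.png", cv2.IMREAD_GRAYSCALE)

# Pick interest points in the first frame (Shi-Tomasi corners here).
points = cv2.goodFeaturesToTrack(prev_gray, maxCorners=200,
                                 qualityLevel=0.01, minDistance=7)

# maxLevel=3 builds a 4-level Gaussian pyramid and refines flow coarse to fine,
# which tolerates motions too large for single-scale Lucas-Kanade.
new_points, status, err = cv2.calcOpticalFlowPyrLK(
    prev_gray, next_gray, points, None,
    winSize=(15, 15), maxLevel=3)

flow_vectors = (new_points - points)[status.ravel() == 1]
```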

  49. Tracking several objects Use assumptions of physics to compute multiple smooth paths. (work of Sethi and R. Jain) MSU Fall 2013

  50. MSU Fall 2013
