
Introduction to Computer Vision

Introduction to Computer Vision, Lecture 14A, Dr. Roger S. Gaborski. Motion Detection: temporal differencing takes the difference between two temporally adjacent frames; the difference is (almost) the moving pixels, and the static background results in zeros.


Presentation Transcript


  1. Introduction to Computer Vision Lecture 14A Dr. Roger S. Gaborski

  2. Motion Detection
  • Temporal Differencing: take the difference between two temporally adjacent frames. The difference is (almost) the moving pixels; the static background results in zeros.
  • Can adapt to changing lighting conditions because the difference frames are only 1/30 of a second apart (typical video is 30 frames per second, 30 fps).
  Roger S. Gaborski
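A minimal sketch of temporal differencing on two already-decoded frames; the file names, the variable names, and the 0.1 threshold are illustrative rather than taken from the slides:

  % Hedged sketch: temporal differencing of two adjacent frames
  f1 = im2double(imread('frame001.jpg'));   % frame at time t (illustrative file name)
  f2 = im2double(imread('frame002.jpg'));   % frame at time t+1
  d  = abs(f2 - f1);                        % per-pixel absolute difference
  motionMask = max(d, [], 3) > 0.1;         % threshold across color channels (illustrative value)
  figure, imshow(motionMask), title('Moving pixels (approximate)')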

  3. Motion Detection: Temporal Differencing Issues • Not all of the relevant pixels are extracted • Some background pixels are extracted. Roger S. Gaborski

  4. Motion Detection (figure: frame at time t, frame at time t+1, and the frame difference; the red block appears as two separate objects in the difference image) Roger S. Gaborski

  5. Sidewalk Scene Roger S. Gaborski

  6. Motion Detection: Difference Method (Sidewalk 12_5) Roger S. Gaborski

  7. Processing Video in Matlab
  %C:\Program Files\MATLAB\R2008b\toolbox\OCR\BackgroundAnalysis02_02_2010
  bkg = 20;  %frames of video to be processed
  fname = 'Office1.avi';
  vidObj = mmreader(fname);
  %Play video
  implay('Office1.avi');
  nFrames = vidObj.NumberOfFrames;
  rw = vidObj.Height;
  cl = vidObj.Width;
  numFrames = 1000;
  % CODE TO PROCESS FRAMES HERE
  Roger S. Gaborski

  8. Writing Frames to Directory
  imwrite(objBox,['.\video\','LabScene','.',num2str(i),'.jpg'],'jpg');
  imwrite(objBoxVideo,['.\video\','LabSceneColor','.',num2str(i),'.jpg'],'jpg');
  Roger S. Gaborski

  9. MATLAB
  %motionDet.m
  fname = 'sidewalk11_23indeo.avi';
  vidObj = mmreader(fname);
  a = aviread(fname);
  frameInfo = aviinfo(fname);
  totalFrames = frameInfo.NumFrames
  for i = 1:50   %for i = 1:totalFrames-1
      currentFrameDiff = abs(im2double(a(1,i+1).cdata) - im2double(a(1,i).cdata));
      movDiff(i) = im2frame(currentFrameDiff);
  end
  %MATLAB Movie file
  figure, movie(movDiff)
  % FOR AVI MOVIE
  %movie2avi(movDiff,'sidewalk12_05_07.avi','compression', 'none');
  Roger S. Gaborski

  10. Processing .avi files
  %readWriteAviFiles
  fname = 'CarsTarget2.avi';
  % extracting the frame information.
  %frameInfo = aviinfo( strcat( pathname, fname ));
  frameInfo = aviinfo( fname );
  disp( frameInfo );
  for cnt = 1:20
      mov1 = aviread(fname, cnt);
      frame1 = mov1(1,1).cdata;  %uint8
      image1 = im2double(frame1);
      figure, imshow(image1);
      %WRITE INDIVIDUAL FRAMES TO DIRECTORY
      imwrite(image1,['.\video\','CarVideo','.',num2str(cnt),'.jpg'],'jpg');
  end
  Roger S. Gaborski

  11. Create Video from Individual Frames • VirtualDub Roger S. Gaborski

  12. Motion Detection: Background Modeling
  • Model the background without moving objects.
  • Represent each pixel in the frame with a 3D Gaussian: mean red, green, blue values and a covariance matrix.
  • For each pixel, collect n pixel triplets.
  • Use the triplets to estimate the mean and covariance matrix.
  • Process future frames by determining the probability of each pixel in the new frame.
  • Threshold the probability: p(r,c) > thres marks a foreground pixel (moving object).
  • Compare pixel values in the current frame and estimate whether each pixel is represented by the background distribution or more likely comes from a different distribution (and therefore belongs to a new object not in the background).
  Roger S. Gaborski
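A minimal sketch of such a per-pixel model, simplified to an independent Gaussian per color channel (diagonal covariance rather than the full covariance matrix). The variable frames is a hypothetical rows-by-cols-by-3-by-n array of background-only training frames, the file name is illustrative, and the threshold of 25 is an illustrative value, not one from the slides:

  % Hedged sketch: per-pixel Gaussian background model with diagonal covariance
  mu   = mean(frames, 4);                       % per-pixel, per-channel mean
  sig2 = var(frames, 0, 4) + 1e-6;              % per-pixel variance (small floor avoids division by zero)

  % Classify a new frame: squared Mahalanobis-style distance under the diagonal model
  newFrame = im2double(imread('frame100.jpg')); % hypothetical incoming frame
  d2 = sum((newFrame - mu).^2 ./ sig2, 3);      % sum over the three color channels
  foreground = d2 > 25;                         % threshold (illustrative value)
  figure, imshow(foreground), title('Foreground (moving object) pixels')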

  13. Updating Gaussian Distributions
  • Small changes in the environment will result in thresholding errors.
  • Adapt the Gaussian models by calculating a weighted average.
  • Estimate the means and covariance matrix from the initial frames.
  • Update the distributions using pixels identified as background; the distributions will then adjust for slight changes in lighting conditions.
  Roger S. Gaborski
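A minimal sketch of the weighted-average update, continuing the variable names from the previous sketch (mu, sig2, newFrame, and foreground are carried-over assumptions; alpha is an illustrative learning rate):

  % Hedged sketch: running-average update of the background model
  alpha = 0.05;                               % illustrative learning rate
  bg = repmat(~foreground, [1 1 3]);          % update only pixels identified as background
  mu(bg)   = (1 - alpha)*mu(bg)   + alpha*newFrame(bg);
  sig2(bg) = (1 - alpha)*sig2(bg) + alpha*(newFrame(bg) - mu(bg)).^2;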

  14. Instead of using the estimated covariance matrix, use the identity matrix • How does this change affect performance? Roger S. Gaborski

  15. Pixel Modeling: Stationary Camera (Sidewalk, Threshold) Roger S. Gaborski

  16. Overpass Object Tracking Overpass Roger S. Gaborski

  17. Face Tracking Example – All Objects Roger S. Gaborski

  18. Face Tracking Example – Faces Only Roger S. Gaborski

  19. Non-stationary Camera
  • Example: a camera panning a scene.
  • One approach is to register the adjacent frames:
  • Find key points in adjacent frames
  • Determine the offset
  • Adjust the images so that they overlap
  • Take the difference
  Roger S. Gaborski
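A hedged sketch of these registration steps. It uses Computer Vision Toolbox feature functions from MATLAB releases newer than the R2008b paths shown elsewhere in this deck; the frame file names and the 0.1 threshold are illustrative:

  % Hedged sketch: register adjacent frames, then difference them
  I1 = rgb2gray(im2double(imread('pan001.jpg')));   % frame at time t (illustrative name)
  I2 = rgb2gray(im2double(imread('pan002.jpg')));   % frame at time t+1

  % Find key points in the adjacent frames and match them
  pts1 = detectSURFFeatures(I1);  [f1, v1] = extractFeatures(I1, pts1);
  pts2 = detectSURFFeatures(I2);  [f2, v2] = extractFeatures(I2, pts2);
  pairs = matchFeatures(f1, f2);
  m1 = v1(pairs(:,1));  m2 = v2(pairs(:,2));

  % Determine the offset (similarity transform) and warp frame 2 onto frame 1
  tform = estimateGeometricTransform(m2, m1, 'similarity');
  I2reg = imwarp(I2, tform, 'OutputView', imref2d(size(I1)));

  % Take the difference of the registered frames
  motion = abs(I1 - I2reg) > 0.1;                   % illustrative threshold
  figure, imshow(motion), title('Motion after registration')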

  20. Panning a Building Complex Pan C Roger S. Gaborski

  21. Overall approach Roger S. Gaborski

  22. Points of Interest Pan C Interest Points Roger S. Gaborski

  23. Correspondence Pan C Correspondence Roger S. Gaborski

  24. Difference Pan C subtract Roger S. Gaborski

  25. Video Sequence: Detection and Tracking, Object Tracking (Silver Car) Roger S. Gaborski

  26. Non-Stationary Camera: Video Sequence of Harry (Harry the Dog) Roger S. Gaborski

  27. Non-Stationary Camera: Model-based Tracking
  • Create a model of an object of interest.
  • In each frame, search for the model.
  Roger S. Gaborski
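One way to search each frame for the model is normalized cross-correlation. A minimal sketch follows; the template and frame file names are hypothetical, and normxcorr2 is an Image Processing Toolbox function:

  % Hedged sketch: model-based tracking by template matching
  template = rgb2gray(im2double(imread('harry_template.jpg')));   % hypothetical object model
  frame    = rgb2gray(im2double(imread('harry_frame050.jpg')));   % hypothetical video frame

  % Search the frame for the model with normalized cross-correlation
  c = normxcorr2(template, frame);
  [~, idx] = max(c(:));
  [peakRow, peakCol] = ind2sub(size(c), idx);

  % Convert the correlation peak to the top-left corner of the match in the frame
  topLeftRow = peakRow - size(template, 1) + 1;
  topLeftCol = peakCol - size(template, 2) + 1;
  figure, imshow(frame)
  rectangle('Position', [topLeftCol, topLeftRow, size(template,2), size(template,1)], 'EdgeColor', 'r');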

  28. Non-Stationary Camera: Video Sequence of Harry Roger S. Gaborski

  29. Edge Magnitude and Orientation

  30. Matlab Script to Calculate Magnitude and Orientation
  • Read in the image (in this example, Sign4.jpg from the webpage) and convert it to double in the range [0,1]
  • Convert to grayscale
  • Use the following filters to calculate the edges:
  fx = [ -1 -1 -1; 0 0 0; 1 1 1]
  fy = [ -1 0 1; -1 0 1; -1 0 1]
  • Use imfilter to find the edges in the x and y directions (Ifx and Ify)
  • Use the subplot function to display the two images in one row and two columns
  Roger S. Gaborski
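A hedged sketch of the steps listed above, using the file name and filter kernels from the slide (the variable names other than fx, fy, Ifx, Ify are illustrative):

  % Hedged sketch: edge responses in the x and y directions
  I  = im2double(imread('Sign4.jpg'));        % read and convert to double in [0,1]
  Ig = rgb2gray(I);                           % convert to grayscale

  fx = [ -1 -1 -1; 0 0 0; 1 1 1 ];            % edge filters given on the slide
  fy = [ -1 0 1; -1 0 1; -1 0 1 ];

  Ifx = imfilter(Ig, fx);                     % edges in the x direction
  Ify = imfilter(Ig, fy);                     % edges in the y direction

  figure
  subplot(1,2,1), imshow(Ifx, []), title('Ifx')
  subplot(1,2,2), imshow(Ify, []), title('Ify')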

  31. Grayscale Image Roger S. Gaborski

  32. EDGE IMAGES Ifx and Ify Roger S. Gaborski

  33. Find the magnitude of the edges and display the edge image (this is NOT a binary image) • figure, imshow(mag, []), title('Magnitude of Edges in Image') • Display a histogram of the magnitudes Roger S. Gaborski
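A hedged sketch continuing from Ifx and Ify above; the 50-bin histogram is an illustrative choice:

  % Hedged sketch: edge magnitude and its histogram
  mag = sqrt(Ifx.^2 + Ify.^2);                % edge magnitude (not a binary image)
  figure, imshow(mag, []), title('Magnitude of Edges in Image')
  figure, hist(mag(:), 50), title('Histogram of Edge Magnitudes')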

  34. Roger S. Gaborski

  35. Roger S. Gaborski

  36. Find the orientation in degrees (NOT radians) of all the edges • Display a histogram of the orientation of the edges Roger S. Gaborski
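A hedged sketch using one common orientation convention (atan2 of the y response over the x response, converted to degrees); the bin count is illustrative:

  % Hedged sketch: edge orientation in degrees and its histogram
  orient = atan2(Ify, Ifx) * 180 / pi;        % orientation in degrees (NOT radians)
  figure, imshow(orient, []), title('Orientation of Edges in Image')
  figure, hist(orient(:), 50), title('Histogram of Edge Orientations')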

  37. Roger S. Gaborski

  38. Find the magnitude value of the strongest edge and assign it to the variable max_mag
  • %Find edges that are 10%, 25% and 50% greater than the max_mag
  • ed10 = ???
  • ed25 = ???
  • ed50 = ???
  • DISPLAY THESE EDGES
  • figure, imshow(ed10, []), title('Edges that are at least 10% or greater of maximum edge strength')
  • figure, imshow(ed25, []), title('Edges that are at least 25% or greater of maximum edge strength')
  • figure, imshow(ed50, []), title('Edges that are at least 50% or greater of maximum edge strength')
  Roger S. Gaborski

  39. Roger S. Gaborski

  40. Roger S. Gaborski

  41. Roger S. Gaborski

  42. Display Oriented Edges using a Color Map
  • figure, imshow(???, []), title('Orientation of edges that are at least 10% or greater of maximum edge strength')
  • colormap(jet)
  • figure, imshow(???, []), title('Orientation of edges that are at least 25% or greater of maximum edge strength')
  • colormap(jet)
  • figure, imshow(???, []), title('Orientation of edges that are at least 50% or greater of maximum edge strength')
  • colormap(jet)
  Roger S. Gaborski

  43. EMAIL THESE RESULTS WITH THE SCRIPT Roger S. Gaborski

  44. EMAIL THESE RESULTS WITH THE SCRIPT Roger S. Gaborski

  45. EMAIL THESE RESULTS WITH THE SCRIPT Roger S. Gaborski

  46. Extra Credit
  • Write a script to perform the operations described in this presentation and discussed in class.
  • Use good MATLAB programming practices.
  • No loops, etc. Zero credit if loops are used.
  • Not counting print statements, about 15 lines of code.
  • Use the PinkHotel.jpg image from the course webpage.
  • Email the script and the last three images to the course account by 8pm today, Tuesday.
  • You must work independently; students not working independently will receive a zero for Exam 1.
  • 10 points added to the Exam 1 score for correct programming techniques and correct answers.
  • No partial credit.
  Roger S. Gaborski
  C:\Program Files\MATLAB\R2008b\toolbox\CompVision2009\EdgeOrientationDisplay

  47. Pattern Recognition
  • After we make a number of measurements (color, texture, etc.) on our image, we would like to classify the image into relevant regions.
  • In an outdoor scene we may be interested in finding people, cars, roads, signs, etc.
  • We may want to classify an image as either an outdoor scene or an indoor scene.
  Roger S. Gaborski

  48. General Pattern Classifier (block diagram: Color Image → Feature Extractor Algorithm → Feature Vector → Classifier Algorithm) Roger S. Gaborski

  49. Classifiers for Gaussian Data Sets

  50. Training and Testing Datasets
  • Training data is used to train the classifier, or to find the parameters of a classifier. For a Gaussian classifier we need to estimate the mean and variance of the data for each class.
  • Testing data is a separate set of data that is not used during training, but is used to test the classifier.
  Roger S. Gaborski
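A minimal sketch of this train/test split for a two-class Gaussian classifier on one-dimensional features; the data here is synthetic and the variable names are hypothetical:

  % Hedged sketch: two-class Gaussian classifier on 1-D feature data
  trainA = randn(100,1) + 1;    trainB = randn(100,1) + 4;   % synthetic training data
  test   = [0.5; 3.8; 2.2];                                  % synthetic test samples

  % Training: estimate the mean and variance of each class from the training data
  muA = mean(trainA);  varA = var(trainA);
  muB = mean(trainB);  varB = var(trainB);

  % Testing: assign each test sample to the class with the higher likelihood
  pA = exp(-(test - muA).^2 ./ (2*varA)) ./ sqrt(2*pi*varA);
  pB = exp(-(test - muB).^2 ./ (2*varB)) ./ sqrt(2*pi*varB);
  predictedClass = 1 + (pB > pA)            % 1 = class A, 2 = class B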
