
Foundations of Computer Vision Lecture 14



Presentation Transcript


  1. Foundations of Computer Vision, Lecture 14 Roger S. Gaborski 1

  2. RECALL:Orange Flower Example Roger S. Gaborski

  3. Orange Flower Example • 10-pixel sample from orange flower • >> mr = mean(data(:,1)) • mr = 1 • >> mg = mean(data(:,2)) • mg = 0.5325 • >> mb = mean(data(:,3)) • mb = 0 Roger S. Gaborski

  4. distMeasure = sqrt((I(:,:,1)-mr).^2 + (I(:,:,2)-mg).^2 + (I(:,:,3)-mb).^2); >> figure, imshow(distMeasure, []) Roger S. Gaborski

  5. The flower should correspond to pixel locations with a small distance measure. Threshold the distance measure. What threshold value??? >> figure, hist(distMeasure(:),100) Roger S. Gaborski

  6. >> Iflower = distMeasure<.45; >> figure, imshow(Iflower) Roger S. Gaborski

  7. >> Fred = I(:,:,1).*Iflower; >> Fgrn = I(:,:,2) .*Iflower; >> Fblu = I(:,:,3) .*Iflower; >> F(:,:,1) = Fred; >> F(:,:,2) = Fgrn; >> F(:,:,3) = Fblu; >> figure, imshow(F) Roger S. Gaborski

  8. Extract and Analyze Brandy >> im = imread('IMGP1715.JPG'); >> imSm = imresize(im, .25); >> figure, imshow(imSm) Roger S. Gaborski 8

  9. Approaches Gray scale thresholding Roger S. Gaborski 9

  10. 1. Gray scale thresholding • Approach – First convert to gray scale (losing color information), then threshold • >> imSmGray = rgb2gray(imSm); • >> imSmGray = im2double(imSmGray); • >> figure, imshow(imSmGray) • >> figure, imshow(im2bw(imSmGray,graythresh(imSmGray))) Roger S. Gaborski 10

  11. Clearly unsuccessful. WHY DID IT FAIL?? The intensity values of Brandy's pixels are very close to the intensity values of the background pixels, which makes it hard to segment based on the intensity distribution. Roger S. Gaborski 11

  12. Grayscale Histogram >> max(imSmGray(:)) ans = 0.9804 >> min(imSmGray(:)) ans = 0.0510 >> figure, imhist(imSmGray) No clear separation between the dog and the background Roger S. Gaborski 12

  13. Approaches Gray scale thresholding Detect edges and then segment Roger S. Gaborski 13

  14. 2. Edge Detection: Sobel Roger S. Gaborski 14

  15. 2. Edge Detection: Laplacian of Gaussian Roger S. Gaborski 15

  16. 2. Edge Detection: Canny Roger S. Gaborski 16
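
  The edge images on slides 14-16 could be produced with MATLAB's edge function (not necessarily with the parameters used for the slides). A minimal sketch, assuming imSmGray from slide 10 and default parameters for each detector:
  % Apply the three edge detectors to the grayscale image from slide 10
  edgeSobel = edge(imSmGray, 'sobel');
  edgeLoG   = edge(imSmGray, 'log');     % Laplacian of Gaussian
  edgeCanny = edge(imSmGray, 'canny');
  figure
  subplot(1,3,1), imshow(edgeSobel), title('Sobel')
  subplot(1,3,2), imshow(edgeLoG), title('LoG')
  subplot(1,3,3), imshow(edgeCanny), title('Canny')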

  17. Reason for the failures • Gray scale thresholding and edge detection: • Both algorithms work in gray-scale space, taking into account only the intensity values of the pixels. However, the intensity values of the dog and the grass are very similar to each other, which makes the noise very hard to eliminate. The edge detection algorithms also fail in this case. • They ignore the most informative component: the distinct colors of the brown dog and the green grass

  18. Approaches Gray scale thresholding Detect edges and then segment Color segmentation Color spaces: RGB Euclidean distance Mahalanobis distance Roger S. Gaborski 18

  19. 3. Color Segmentation: Euclidean Distance Manually select pixels Roger S. Gaborski 19

  20. 3. Color Segmentation: Mahalanobis Distance Brandy Noise from brown earth Manually select pixels Roger S. Gaborski 20
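
  The Mahalanobis-distance result on slide 20 is not shown as code on the slides; the following is a minimal sketch of one way to compute it, assuming sampleData is an N-by-3 matrix of manually selected dog pixels and imSm is the resized image from slide 8:
  % Mahalanobis distance of every pixel from the sampled dog color
  m = mean(sampleData);                       % 1x3 mean RGB of the samples
  C = cov(sampleData);                        % 3x3 covariance of the samples
  I = im2double(imSm);
  [rows, cols, ~] = size(I);
  pix = reshape(I, rows*cols, 3);             % one RGB triple per row
  d = pix - repmat(m, rows*cols, 1);          % difference from the mean
  distMahal = sqrt(sum((d / C) .* d, 2));     % (x-m) * inv(C) * (x-m)'
  distMahal = reshape(distMahal, rows, cols);
  figure, imshow(distMahal, [])               % then threshold as before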

  21. Original Brandy picture. The brown earth regions have a very similar color to Brandy. Roger S. Gaborski 21

  22. Discussion None of the 3 planes will work on its own for the segmentation of Brandy and the grass. However, by combining the R, G, and B planes together, we can roughly segment Brandy from the grass with the Euclidean distance and achieve desirable segmentation with the Mahalanobis distance (which takes into account the correlations between the different color planes). Roger S. Gaborski 22

  23. Individual Color Planes >> figure, subplot(2,3,1),imshow(imSm(:,:,1)),title('Red') >> subplot(2,3,2),imshow(imSm(:,:,2)),title('Green') >> subplot(2,3,3),imshow(imSm(:,:,3)),title('Blue') >> subplot(2,3,4),imshow(im2bw(imSm(:,:,1),graythresh(imSm(:,:,1)))), title('Red Threshold') >> subplot(2,3,5),imshow(im2bw(imSm(:,:,2),graythresh(imSm(:,:,2)))), title('Green Threshold') >> subplot(2,3,6),imshow(im2bw(imSm(:,:,3),graythresh(imSm(:,:,3)))), title('Blue Threshold') Roger S. Gaborski 23

  24. HSV Color Space Viewing the H, S, and V planes combined as a single image doesn't seem to work. >> imH = rgb2hsv(imSm); >> figure, imshow(imH) Roger S. Gaborski 24

  25. The Hue plane shows distinct hues for the brown color (Brandy) and the green color (grass), perfect for separating the dog from the background; the dog is hard to distinguish in the Saturation and Value planes. >> figure, subplot(1,3,1),imshow(imH(:,:,1)),title('Hue') >> subplot(1,3,2),imshow(imH(:,:,2)),title('Saturation') >> subplot(1,3,3),imshow(imH(:,:,3)),title('Value') Roger S. Gaborski 25

  26. >> imhist(imH(:,:,1)) Histogram distribution in Hue space: the grass distribution and the dog distribution, with a separating line between them. Roger S. Gaborski 26

  27. Dog pixels have gray level value = 0; some regions still have a hue very similar to Brandy's. >> level = graythresh(imH(:,:,1)) level = 0.1725 (automatic threshold) >> figure, imshow(imH(:,:,1)>level) Roger S. Gaborski 27

  28. Summary Color is the ideal descriptor for segmenting Brandy from the grass (distinct colors) Edge detection algorithms fail when the intensity values of adjacent pixels are very similar to each other We will continue with color segmentation and morphological processing in the next lecture Follow-up assignments on region growing and color segmentation will be posted on the course website shortly. You will be informed when they are posted.

  29. Brandy

  30. RainGirl Color Segmentation

  31. Approaches • We look at two approaches for color segmentation: • Region segmentation using a distance measurements • Region growing using seeds and a distance measurement
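
  A minimal sketch of the second approach, seeded region growing with a color distance measure. The function name, seed location, and threshold below are illustrative assumptions, not the course code:
  function mask = growRegion(I, seedRow, seedCol, T)
  % I: double RGB image. Grow a region outward from the seed pixel,
  % keeping only 4-connected pixels whose color distance to the seed
  % color is below the threshold T.
  seedColor = I(seedRow, seedCol, :);                               % 1x1x3
  dist = sqrt(sum((I - repmat(seedColor, size(I,1), size(I,2))).^2, 3));
  similar = dist < T;                   % pixels close enough in color
  mask = false(size(similar));
  mask(seedRow, seedCol) = true;
  added = true;
  while added                           % grow until no new pixel is added
      grownMask = (imdilate(mask, strel('disk',1)) & similar) | mask;
      added = any(grownMask(:) & ~mask(:));
      mask = grownMask;
  end
  end
  % Example use (seed position and threshold are illustrative values):
  % >> mask = growRegion(im2double(imSm), 100, 150, 0.2);
  % >> figure, imshow(mask)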

  32. Distance Measurement

  33. Distance Map

  34. Distance Threshold <.15

  35. Distance Threshold <.25
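
  Slides 32-35 show image results only; the distance map and thresholds could be produced with the same RGB Euclidean distance used for the orange flower. A sketch, assuming I is the double RGB RainGirl image and mr, mg, mb are the mean R, G, B values of manually sampled pixels:
  >> distMeasure = sqrt((I(:,:,1)-mr).^2 + (I(:,:,2)-mg).^2 + (I(:,:,3)-mb).^2);
  >> figure, imshow(distMeasure, [])      % distance map (slide 33)
  >> figure, imshow(distMeasure < .15)    % distance threshold < .15 (slide 34)
  >> figure, imshow(distMeasure < .25)    % distance threshold < .25 (slide 35)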

  36. L*a*b Color Space • 'L*': luminosity or brightness layer • 'a*': chromaticity layer indicating where the color falls along the red-green axis • 'b*': chromaticity layer indicating where the color falls along the blue-yellow axis

  37. Distance Measure in L*a*b Space • Only use a and b planes • Manually sample image • Estimate mean of a and b values • Calculate distance as before
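
  A sketch of the same steps in code, assuming I is the same double RGB image and sampleA, sampleB hold the a* and b* values of the manually sampled pixels. rgb2lab requires a recent Image Processing Toolbox; older versions can use applycform with makecform('srgb2lab'):
  imLab = rgb2lab(I);                     % convert RGB to L*a*b*
  mA = mean(sampleA);                     % mean a* of the sampled pixels
  mB = mean(sampleB);                     % mean b* of the sampled pixels
  distMeasure = sqrt((imLab(:,:,2)-mA).^2 + (imLab(:,:,3)-mB).^2);
  figure, imshow(distMeasure, [])         % distance map (slide 38)
  figure, imshow(distMeasure < 40)        % thresholds as on slides 39-42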

  38. Distance Map for L*a*b Space

  39. distMeasure < 40

  40. distMeasure < 30

  41. distMeasure < 20

  42. distMeasure < 10

  43. K-means Clustering • MATLAB: IDX = kmeans(X,k) partitions the points in the n-by-p data matrix X into k clusters. • How many clusters? k • Distance measure: Euclidean
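
  An illustrative call for color segmentation (kmeans is in the Statistics Toolbox). Here X is the Brandy image imSm reshaped to one RGB triple per row, and k = 2 (dog vs. grass) is an assumption:
  X = reshape(im2double(imSm), [], 3);    % one RGB triple per row
  IDX = kmeans(X, 2);                     % partition into k = 2 clusters
  labels = reshape(IDX, size(imSm,1), size(imSm,2));
  figure, imshow(labels == 1)             % pixels assigned to cluster 1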

  44. Very Simple Example • Consider 1-dimensional data (the algorithm works with n-dimensional data): 1, 2, 3, 5, 6, 7, 8, 9 • Assume k = 2 • Assume cluster centers: randomly pick 2 values

  45. Very Simple Example • Measure the distance between the centers and the remaining points • Assign each point to the closer center • Recalculate the centers based on membership: z1 = 1, z2 = (2+3+5+6+7+8+9)/7 = 5.7143

  46. Very Simple Example • New centers: z1 = 1.000, z2 = 5.7143 • Assign points to the closer new center • Recalculate the centers based on membership: z1 = (1+2+3)/3 = 2.0, z2 = (5+6+7+8+9)/5 = 7.0

  47. Very Simple Example • No points are reassigned, so the algorithm is done: z1 = 2.0, z2 = 7.0
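
  The same hand computation can be checked with MATLAB's kmeans; the starting centers 1 and 2 below are an assumption consistent with the first iteration shown above:
  x = [1 2 3 5 6 7 8 9]';                        % the 1-D data points
  [idx, centers] = kmeans(x, 2, 'Start', [1; 2]);
  % centers should converge to 2.0 and 7.0, matching z1 and z2 above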

  48. K-means Clustering • The K-means algorithm has self-organizing properties • n-dimensional vectors may be considered points in an n-dimensional Euclidean space • By a Euclidean space we mean an Rn space with the distance between vectors x and y defined as: d(x,y) = sqrt( (x1-y1)^2 + (x2-y2)^2 + … + (xn-yn)^2 ) • Euclidean norm or length of x: ||x|| = sqrt( x1^2 + x2^2 + … + xn^2 ) • K-means is one of many techniques that use the notion of clustering by minimum distance • Why does using minimum distance make sense?

  49. K-means Clustering • Two vectors that represent points in n-space that are geometrically close may in some sense belong together • Notation: • Norm or length of vector x: ||x|| = ( Σi xi^2 )^(1/2) • Distance between two vectors: ||x-z|| = ( Σi (xi-zi)^2 )^(1/2)
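
  For reference, both quantities are a single call in MATLAB (the vectors below are only illustrative):
  x = [1 2 3];  z = [4 0 3];     % illustrative vectors
  normX = norm(x);               % sqrt(1^2 + 2^2 + 3^2)
  distXZ = norm(x - z);          % sqrt((1-4)^2 + (2-0)^2 + (3-3)^2)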

  50. K-Means Algorithm • We measure how close vectors are • We establish cluster centers and partition the vectors into clusters such that the distance between a vector and the cluster center it is assigned to is minimal compared to the other centers • With k-means you need to know the number of cluster centers.
