
Appearance-based recognition & detection II



  1. Appearance-based recognition & detection II Kristen Grauman UT-Austin Tuesday, Nov 11

  2. Last time • Appearance-based recognition: using global appearance descriptions within a window to characterize a class. • Classification: basic idea of supervised learning • Skin color detection example • Sliding windows: detection via classification • Make a yes/no decision at every window • Face detection example using boosting and rectangular features [Viola-Jones 2001]
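
A minimal sketch of the sliding-window idea recapped above, assuming a window-level classifier is already trained; the function name, window size, and stride are illustrative, and a real detector would also rescan the image over multiple scales:

```python
import numpy as np

def sliding_window_detect(image, classifier, win=(24, 24), step=4):
    """Run a binary classifier over every window of an image.

    `classifier` is any function mapping a win-sized patch to a score;
    windows with positive score are kept as detections.
    """
    H, W = image.shape[:2]
    h, w = win
    detections = []
    for y in range(0, H - h + 1, step):
        for x in range(0, W - w + 1, step):
            patch = image[y:y + h, x:x + w]
            score = classifier(patch)
            if score > 0:  # yes/no decision at every window
                detections.append((x, y, w, h, score))
    return detections
```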

  3. Misc notes • Extra disk space • SIFT extraction • http://www.cs.ubc.ca/~lowe/keypoints/

  4. Today • Additional classes well-suited to global appearance representations • Discriminative classifiers • Boosting (last time) • Nearest neighbors • Support vector machines • Application to pedestrian detection • Application to gender classification

  5. Viola-Jones Face Detector: Summary
  • Train with 5K positives, 350M negatives
  • Real-time detector using a 38-layer cascade
  • 6061 features in the final layer
  • Implementation available in OpenCV: http://www.intel.com/technology/computing/opencv/
  [Diagram: train a cascade of classifiers with AdaBoost (selected features, thresholds, and weights); apply it to each subwindow of a new image to label it face / non-face] K. Grauman, B. Leibe
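
Since the slide points to the OpenCV implementation, here is a short sketch of running a pretrained Haar cascade with OpenCV's modern Python API (the input filename is hypothetical, and the bundled cascade path can vary across OpenCV versions):

```python
import cv2

# Load a pretrained Viola-Jones-style cascade shipped with OpenCV.
cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

img = cv2.imread("group_photo.jpg")  # hypothetical input image
gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)

# detectMultiScale slides the cascade over the image at multiple scales.
faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
for (x, y, w, h) in faces:
    cv2.rectangle(img, (x, y), (x + w, y + h), (0, 255, 0), 2)
cv2.imwrite("detections.jpg", img)
```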

  6. Viola-Jones Face Detector: Results

  7. Example application Frontal faces detected and then tracked; character names inferred by aligning the script and subtitles. Everingham, M., Sivic, J. and Zisserman, A. "Hello! My name is... Buffy" – Automatic naming of characters in TV video, BMVC 2006. http://www.robots.ox.ac.uk/~vgg/research/nface/index.html K. Grauman, B. Leibe

  8. Example application: faces in photos

  9. Other classes that might work with global appearance in a window?

  10. Penguin detection & identification This project uses the Viola-Jones AdaBoost face detection algorithm to detect penguin chests, and then matches the pattern of spots to identify a particular penguin. Burghart, Thomas, Barham, and Calic. Automated Visual Recognition of Individual African Penguins, 2004.

  11. Use rectangular features; select good features to distinguish chests from non-chests with AdaBoost. Burghart, Thomas, Barham, and Calic. Automated Visual Recognition of Individual African Penguins, 2004.

  12. Attentional cascade • Penguin chest detections Burghart, Thomas, Barham, and Calic. Automated Visual Recognition of Individual African Penguins, 2004.

  13. Given a detected chest, try to extract the whole chest for this particular penguin. Burghart, Thomas, Barham, and Calic. Automated Visual Recognition of Individual African Penguins, 2004.

  14. Example detections Burghart, Thomas, Barham, and Calic. Automated Visual Recognition of Individual African Penguins, 2004.

  15. Perform identification by matching the pattern of spots to a database of known penguins. Burghart, Thomas, Barham, and Calic. Automated Visual Recognition of Individual African Penguins, 2004.

  16. Penguin detection & identification Burghart, Thomas, Barham, and Calic. Automated Visual Recognition of Individual African Penguins, 2004.

  17. Discriminative classifiers
  • Nearest neighbor (10^6 examples): Shakhnarovich, Viola, Darrell 2003; Berg, Berg, Malik 2005; …
  • Neural networks: LeCun, Bottou, Bengio, Haffner 1998; Rowley, Baluja, Kanade 1998; …
  • Support Vector Machines: Guyon, Vapnik; Heisele, Serre, Poggio 2001; …
  • Boosting: Viola, Jones 2001; Torralba et al. 2004; Opelt et al. 2006; …
  • Conditional Random Fields: McCallum, Freitag, Pereira 2000; Kumar, Hebert 2003; …
  Slide adapted from Antonio Torralba

  18. Today • Additional classes well-suited to global appearance representations • Discriminative classifiers • Boosting (last time) • Nearest neighbors • Support vector machines • Application to pedestrian detection • Application to gender classification

  19. Nearest Neighbor classification • Assign the label of the nearest training data point to each test data point. Black = negative, red = positive: a novel test example that is closest to a positive example from the training set is classified as positive. [Figure from Duda et al.: Voronoi partitioning of feature space for 2-category 2D data]

  20. K-Nearest Neighbors classification • For a new point, find the k closest points from the training data • Labels of the k points “vote” to classify. Example with k = 5, black = negative, red = positive: if the query lands here, the 5 NN consist of 3 negatives and 2 positives, so we classify it as negative. Source: D. Lowe
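
A minimal NumPy sketch of the k-NN voting rule; the toy training points and query are invented to reproduce the 3-negatives-vs-2-positives vote from the slide:

```python
import numpy as np

def knn_classify(X_train, y_train, x_query, k=5):
    """Classify x_query by majority vote among its k nearest neighbors.

    Uses Euclidean distance; any meaningful distance could be swapped in.
    """
    dists = np.linalg.norm(X_train - x_query, axis=1)
    nearest = np.argsort(dists)[:k]
    votes = y_train[nearest]
    values, counts = np.unique(votes, return_counts=True)
    return values[np.argmax(counts)]  # majority label

# Toy data mirroring the slide: 3 negatives vs. 2 positives -> negative.
X = np.array([[0.0, 0.0], [0.1, 0.2], [0.2, 0.1], [1.0, 1.0], [1.1, 0.9]])
y = np.array([-1, -1, -1, +1, +1])
print(knn_classify(X, y, np.array([0.4, 0.4]), k=5))  # -> -1
```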

  21. Example: nearest neighbor classification • We could identify the penguin in the new view based on the distance between its chest spot pattern and all the stored penguins’ patterns. [Figure: labeled database of known penguin examples]

  22. Example: nearest neighbor classification • Similarly, if the video frames we were indexing in the Video Google database had labels, we could classify the query. [Figure: query frame labeled “Rachel, Chandler”; its nearest neighbors in the labeled database of movie frames are NN #1 “Rachel, Phoebe”, NN #2 “Rachel, Chandler”, NN #3 “Rachel, Phoebe”]

  23. Nearest neighbors: pros and cons • Pros: • Simple to implement • Flexible to feature / distance choices • Naturally handles multi-class cases • Can do well in practice with enough representative data • Cons: • Large search problem to find nearest neighbors (see the k-d tree sketch below) • Storage of all the training data • Must have a meaningful distance function
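
One common way to soften the search cost noted above, sketched with SciPy's k-d tree (the descriptors here are random placeholders; note that k-d trees lose their edge as descriptor dimensionality grows):

```python
import numpy as np
from scipy.spatial import cKDTree

rng = np.random.default_rng(0)
X_train = rng.random((100_000, 16))          # 100k stored descriptors
labels = rng.integers(0, 2, size=100_000)

tree = cKDTree(X_train)                       # build once, query many times
dist, idx = tree.query(rng.random(16), k=5)   # 5 nearest neighbors
votes = labels[idx]
print(np.bincount(votes).argmax())            # majority label
```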

  24. Discriminative classifiers
  • Nearest neighbor (10^6 examples): Shakhnarovich, Viola, Darrell 2003; Berg, Berg, Malik 2005; …
  • Neural networks: LeCun, Bottou, Bengio, Haffner 1998; Rowley, Baluja, Kanade 1998; …
  • Support Vector Machines: Guyon, Vapnik; Heisele, Serre, Poggio 2001; …
  • Boosting: Viola, Jones 2001; Torralba et al. 2004; Opelt et al. 2006; …
  • Conditional Random Fields: McCallum, Freitag, Pereira 2000; Kumar, Hebert 2003; …
  Slide adapted from Antonio Torralba

  25. Today • Additional classes well-suited to global appearance representations • Discriminative classifiers • Boosting (last time) • Nearest neighbors • Support vector machines • Application to pedestrian detection • Application to gender classification

  26. Linear classifiers

  27. Lines in R2 • Let w = [a c]ᵀ and x = [x y]ᵀ • Then the line ax + cy + b = 0 can be written compactly as w·x + b = 0

  28. Lines in R2 • Let w = [a c]ᵀ and x = [x y]ᵀ, so the line is w·x + b = 0 • The vector w is normal to the line: for any two points x1, x2 on the line, w·(x1 − x2) = 0

  29. Linear classifiers • Find a linear function to separate the positive and negative examples. Which line is best?

  30. Support Vector Machines (SVMs) • Discriminative classifier based on the optimal separating line (for the 2D case) • Maximize the margin between the positive and negative training examples

  31. Support vector machines • Want the line that maximizes the margin. • Decision boundary: w·x + b = 0; margin lines: w·x + b = 1 and w·x + b = −1 • For support vectors, w·x + b = ±1 [Figure: separating line and margin; the support vectors lie on the margin lines] C. Burges, A Tutorial on Support Vector Machines for Pattern Recognition, Data Mining and Knowledge Discovery, 1998

  32. Lines in R2 • Let w = [a c]ᵀ and x = [x y]ᵀ, so the line is w·x + b = 0 • Now consider a point (x0, y0) not on the line

  33. Lines in R2 • Distance from point (x0, y0) to the line ax + cy + b = 0: D = |a x0 + c y0 + b| / √(a² + c²)

  34. Lines in R2 • In vector form, the distance from point x0 to the line w·x + b = 0 is: D = |w·x0 + b| / ||w||
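
A quick numeric check of the distance formula on a made-up line and point:

```python
import numpy as np

# Line 3x + 4y - 10 = 0, i.e. w = [3, 4], b = -10; point x0 = (4, 3).
w, b = np.array([3.0, 4.0]), -10.0
x0 = np.array([4.0, 3.0])
D = abs(w @ x0 + b) / np.linalg.norm(w)  # |w·x0 + b| / ||w||
print(D)                                  # |12 + 12 - 10| / 5 = 2.8
```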

  35. Support vector machines • Want the line that maximizes the margin. • For support vectors, w·x + b = ±1 • Distance between point and line: |w·x + b| / ||w|| • So for a support vector, the distance to the boundary w·x + b = 0 is 1 / ||w|| [Figure: decision boundary w·x + b = 0, margin lines w·x + b = ±1, margin M]

  36. Support vector machines • Want the line that maximizes the margin. • For support vectors, w·x + b = ±1 • Distance between point and line: |w·x + b| / ||w|| • Therefore, the margin is 2 / ||w|| [Figure: support vectors on the margin lines]

  37. Finding the maximum margin line • Maximize margin 2/||w|| • Correctly classify all training data points: w·xi + b ≥ 1 if yi = +1, and w·xi + b ≤ −1 if yi = −1 • Quadratic optimization problem: minimize (1/2)||w||² subject to yi(w·xi + b) ≥ 1 • One constraint for each training point; note the sign trick that folds both cases into a single inequality. C. Burges, A Tutorial on Support Vector Machines for Pattern Recognition, Data Mining and Knowledge Discovery, 1998
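
A sketch of handing this quadratic program to an off-the-shelf solver instead of deriving the solution by hand; scikit-learn's linear SVC with a very large C approximates the hard-margin objective on the slide, and the four training points are invented:

```python
import numpy as np
from sklearn.svm import SVC

# Toy linearly separable data; labels in {-1, +1} as in the constraints.
X = np.array([[0.0, 0.0], [0.0, 1.0], [2.0, 2.0], [2.0, 3.0]])
y = np.array([-1, -1, +1, +1])

# Large C approximates: minimize (1/2)||w||^2  s.t.  y_i (w·x_i + b) >= 1.
clf = SVC(kernel="linear", C=1e6).fit(X, y)
w, b = clf.coef_[0], clf.intercept_[0]
print(w, b, 2.0 / np.linalg.norm(w))  # learned w, b, and margin 2/||w||
```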

  38. Finding the maximum margin line • Solution: w = Σi αi yi xi • Each αi is a learned weight; αi > 0 only for the support vectors. C. Burges, A Tutorial on Support Vector Machines for Pattern Recognition, Data Mining and Knowledge Discovery, 1998

  39. Finding the maximum margin line • Solution: b = yi – w·xi (for any support vector) • Classification function: f(x) = sign(w·x + b) = sign(Σi αi yi xi·x + b) • Notice that it relies on an inner product between the test point x and the support vectors xi • (Solving the optimization problem also involves computing the inner products xi·xj between all pairs of training points) • If f(x) < 0, classify as negative; if f(x) > 0, classify as positive. C. Burges, A Tutorial on Support Vector Machines for Pattern Recognition, Data Mining and Knowledge Discovery, 1998
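
Continuing the same invented example: scikit-learn exposes the support vectors and the products αi·yi, so f(x) can be rebuilt from inner products alone and checked against the library's own prediction:

```python
import numpy as np
from sklearn.svm import SVC

X = np.array([[0.0, 0.0], [0.0, 1.0], [2.0, 2.0], [2.0, 3.0]])
y = np.array([-1, -1, +1, +1])
clf = SVC(kernel="linear", C=1e6).fit(X, y)

# dual_coef_ holds alpha_i * y_i for each support vector, so
# f(x) = sum_i alpha_i y_i (x_i · x) + b uses only inner products.
x_test = np.array([1.5, 1.0])
f = clf.dual_coef_[0] @ (clf.support_vectors_ @ x_test) + clf.intercept_[0]
print(np.sign(f), clf.predict([x_test]))  # the two should agree
```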

  40. How is the SVM objective different from the boosting objective?

  41. Questions • What if the features are not 2d? • What if the data is not linearly separable? • What if we have more than just two categories?

  42. Questions • What if the features are not 2d? • Generalizes to d-dimensions – replace line with “hyperplane” • What if the data is not linearly separable? • What if we have more than just two categories?

  43. Planes in R3 • Let w = [a b c]ᵀ and x = [x y z]ᵀ • The plane ax + by + cz + d = 0 can be written as w·x + d = 0 • Distance from point x0 to the plane: D = |w·x0 + d| / ||w||

  44. Hyperplanes in Rn • Hyperplane H is the set of all vectors x in Rn which satisfy: w·x + b = 0 • Distance from point x0 to hyperplane H: D(H, x0) = |w·x0 + b| / ||w||

  45. Questions • What if the features are not 2d? • What if the data is not linearly separable? • What if we have more than just two categories?

  46. Non-linear SVMs • Datasets that are linearly separable with some noise work out great. • But what are we going to do if the dataset is just too hard? • How about… mapping the data to a higher-dimensional space? [Figure: 1D data on the x axis that no threshold separates becomes separable after the mapping x → (x, x²)] Slide from Andrew Moore’s tutorial: http://www.autonlab.org/tutorials/svm.html
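
A sketch of the slide's lifting trick on invented 1D data: no single threshold on x separates the classes, but after the mapping x → (x, x²) a linear SVM can:

```python
import numpy as np
from sklearn.svm import SVC

# 1D data that no threshold on x separates: positives surround negatives.
x = np.array([-3.0, -2.0, -0.5, 0.0, 0.5, 2.0, 3.0])
y = np.array([+1, +1, -1, -1, -1, +1, +1])

# Lift to 2D with phi(x) = (x, x^2); the classes become linearly
# separable because the positives all have large x^2.
Phi = np.column_stack([x, x**2])
clf = SVC(kernel="linear", C=1e6).fit(Phi, y)

x_test = np.array([0.2, -2.5])
print(clf.predict(np.column_stack([x_test, x_test**2])))  # -> [-1  1]
```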

  47. Another example: Source: Bill Freeman

  48. Another example: Source: Bill Freeman

  49. Non-linear SVMs: Feature spaces • General idea: the original input space can be mapped to some higher-dimensional feature space where the training set is separable: Φ: x → φ(x) Slide from Andrew Moore’s tutorial: http://www.autonlab.org/tutorials/svm.html
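
Tying this back to slide 39's inner-product observation: for suitable maps φ, the feature-space inner product φ(x)·φ(z) can be evaluated directly from x and z without ever forming φ(x). The degree-2 map below is a standard textbook choice, not one taken from the slide:

```python
import numpy as np

def phi(v):
    """Explicit degree-2 feature map for 2D input."""
    x1, x2 = v
    return np.array([x1**2, np.sqrt(2) * x1 * x2, x2**2])

x = np.array([1.0, 2.0])
z = np.array([3.0, 0.5])

# The polynomial kernel (x·z)^2 equals phi(x)·phi(z), so an SVM can
# work in the lifted space without computing phi explicitly.
print(phi(x) @ phi(z))  # 16.0
print((x @ z) ** 2)     # 16.0
```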
