
Feature-Based Classification & Principal Component Analysis




  1. Feature-Based Classification & Principal Component Analysis

  2. Another Approach to Feature-Based Classification
  • Offline:
    • Collect examples of each class
    • Determine the features of each example
    • Store the resulting features as points in “feature space”
  • Online:
    • Get a new instance
    • Determine its features
    • Choose the class that’s closest in “feature space” (sketched in code below)
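A minimal Python sketch of this offline/online pipeline. The stored points are the letter feature vectors (holes, ends, straight, curve) given on slides 4 and 6; the Euclidean metric and the feature ordering are assumptions, not something the slides fix.

```python
import math

# Offline phase: one stored example per class, as points in feature space.
# Feature order is (holes, ends, straight, curve); values from slides 4/6.
STORED = {
    "A": (1, 2, 3, 0),
    "B": (2, 0, 1, 2),
    "C": (0, 2, 0, 1),
    "D": (1, 0, 1, 1),
}

def distance(p, q):
    """Euclidean distance between two feature vectors (assumed metric)."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, q)))

def classify(features):
    """Online phase: pick the class whose stored point is closest."""
    return min(STORED, key=lambda label: distance(STORED[label], features))

print(classify((0, 2, 0, 1)))  # exactly matches the stored "C" -> prints C
```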

  3. Advantages / Disadvantages
  + Simple to compute
  + Avoids partitioning feature space (classes can overlap)
  – Leaves the possibility of large “unmapped” areas in feature space
  – Heavily dependent on a good sample set
  – Highly dependent on a good feature set

  4. Example: Letter Classification
  Classifying capital letters with these features:
  • Holes: A = 1, B = 2, C = 0
  • Ends (does not count angles like the top of A): A = 2, B = 0, C = 2, F = 3
  • Straight: A = 3, B = 1, C = 0, D = 1
  • Curve: A = 0, B = 2, C = 1, D = 1

  5. Feature Classification Example

  6. Classifying a New Letter
  • New letter is ø
  • Holes =
  • Ends =
  • Straight =
  • Curve =
  • Distance to “A” (1, 2, 3, 0) is:
  • Distance to “D” (1, 0, 1, 1) is:
  (The values are left blank on the slide; a worked version follows below.)
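Since the slide leaves the new letter’s feature values blank as an exercise, the vector (1, 1, 2, 1) below is purely hypothetical; it only illustrates how the two distances would be computed.

```python
import math

def distance(p, q):
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, q)))

A = (1, 2, 3, 0)         # (holes, ends, straight, curve), from slide 4
D = (1, 0, 1, 1)
new = (1, 1, 2, 1)       # hypothetical feature vector for the new letter

print(distance(new, A))  # sqrt(0 + 1 + 1 + 1) = sqrt(3) ≈ 1.73
print(distance(new, D))  # sqrt(0 + 1 + 1 + 0) = sqrt(2) ≈ 1.41
```

Under this assumed metric the hypothetical letter would classify as “D”, the nearer stored point.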

  7. Continuing the Example

  8. Evaluating the Features
  • Does a slight modification of a letter still classify to the same letter? (Generalization)
  • Are all different letters distinguishable? (Representation)
  • Are all features independent and useful?
  • How can we modify this feature set to improve the representation?

  9. Multiple Examples per Class
  • Improves robustness (why?)
  • Increases space / time requirements (why?)
  • How can we gain the benefits without too much cost?
    • k nearest neighbors (sketched below)
    • Clustering
    • Partitioning the space (as we saw before)
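A minimal k-nearest-neighbors sketch, assuming several stored examples per class and a simple majority vote among the k closest; the stored vectors and k = 3 are illustrative assumptions.

```python
import math
from collections import Counter

# Hypothetical stored set: multiple (feature_vector, label) pairs per class.
EXAMPLES = [
    ((1, 2, 3, 0), "A"), ((1, 2, 3, 1), "A"),
    ((1, 0, 1, 1), "D"), ((1, 0, 2, 1), "D"),
]

def distance(p, q):
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, q)))

def knn_classify(features, k=3):
    """Majority vote among the k stored examples nearest the query."""
    nearest = sorted(EXAMPLES, key=lambda ex: distance(ex[0], features))[:k]
    votes = Counter(label for _, label in nearest)
    return votes.most_common(1)[0][0]

print(knn_classify((1, 1, 2, 1)))  # two of the three nearest are "D"
```

Voting over k neighbors is what buys the robustness: a single noisy or mislabeled stored example can no longer flip the answer by itself.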

  10. Recognizing Sign Language Letters
  [Hand images signing “A” and “E”; also “I”, “O”, and “U”]

  11. Input
  • 30 × 32 image of a hand signing the letter (grayscale)
  • 960 pixels, values 0–255
  • We have a data set of 30 images per letter for the 5 vowels (arranged as a data matrix in the sketch below)
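Only the shapes here come from the slide (30 images × 5 vowels, each 30 × 32 grayscale); the loader below is a stand-in sketch with random pixels in place of the real data set.

```python
import numpy as np

LETTERS = ["A", "E", "I", "O", "U"]

# Stand-in for loading the real images: 150 images of 30 x 32 pixels, 0-255.
rng = np.random.default_rng(0)
images = rng.integers(0, 256, size=(150, 30, 32))

# Flatten each image into a 960-pixel feature vector: one row per image.
X = images.reshape(150, 960).astype(float)
y = np.repeat(LETTERS, 30)  # class label for each row

print(X.shape, y.shape)  # (150, 960) (150,)
```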

  12. Features?

  13. Most Easily Available Features
  • Pixels from the image: 960 features per image!
  • These are very easy to compute, but we hope we don’t really need all of them
  • Maybe linear combinations (e.g., the total of the upper half) would be useful
  • How can we find out which linear combinations of pixels are the most useful features? (See the PCA sketch below.)
  • How can we decide how many (few) features we need?
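This is the question principal component analysis answers: the most useful linear combinations are the directions of greatest variance in pixel space. A minimal sketch via NumPy’s singular value decomposition, assuming the 150 × 960 data matrix X laid out above:

```python
import numpy as np

def principal_components(X, n_components):
    """Top variance-capturing linear combinations of pixels, plus each
    image's coordinates along them and the fraction of variance explained."""
    X_centered = X - X.mean(axis=0)            # center each pixel column
    # Rows of Vt are unit-length directions in pixel space, ordered by
    # how much of the data's variance they capture.
    U, S, Vt = np.linalg.svd(X_centered, full_matrices=False)
    components = Vt[:n_components]             # (n_components, 960)
    scores = X_centered @ components.T         # (n_images, n_components)
    explained = S[:n_components] ** 2 / np.sum(S ** 2)
    return components, scores, explained

# e.g. compress 960 pixel features to 10 PCA features per image:
# components, scores, explained = principal_components(X, 10)
```

The explained-variance fractions also suggest an answer to the second question: keep just enough components to account for most of the variance, and discard the rest.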

  14. To Be Continued…
  • Smith, L. I. A Tutorial on Principal Component Analysis. February 2002. http://www.cs.otago.ac.nz/cosc453/student_tutorials/principal_components.pdf
  • Shlens, J. A Tutorial on Principal Component Analysis. December 2005. http://www.snl.salk.edu/~shlens/pub/notes/pca.pdf
  • Kirby, M., and L. Sirovich. “Application of the Karhunen-Loève Procedure for the Characterization of Human Faces.” IEEE Trans. Patt. Anal. Mach. Intell. 12.1 (1990): 103-108.
