
CSSE463: Image Recognition Day 31

Learn how Bayesian classifiers work in image recognition, including calculating probabilities, setting thresholds, and using prior information. Explore the impact of Bayesian probabilities and Bayes rule on classifiers. Discover how to apply Bayesian classifiers to indoor-outdoor classification and image orientation detection, using features like color, texture, and camera metadata. Dive into Bayesian networks and conditional probability distributions for efficient classification. Get insights from an IEEE TPAMI paper on orientation detection and single-feature Bayesian classifiers.





Presentation Transcript


  1. CSSE463: Image Recognition Day 31 • Today: Bayesian classifiers • Questions?

  2. Bayesian classifiers • Use training data • Calculate the probabilities of each feature. • If 2 classes: • Classes w1 and w2 • Say, circles vs. non-circles • A single feature, x • Both classes equally likely • Both types of errors equally bad • Where should we set the threshold between classes? Here? • Where in graph are 2 types of errors? [Figure: class-conditional densities p(x|w1) and p(x|w2) over feature x for circles vs. non-circles; the threshold marks the region detected as circles] Q1-4
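
A minimal sketch of this two-class, single-feature setup (the circle/non-circle feature values, distribution parameters, and bin edges below are invented for illustration): with equal priors and equal error costs, the decision falls to whichever estimated class-conditional density is larger at x, so the threshold sits where the two histograms cross.

    # Estimate p(x|w1) and p(x|w2) from (synthetic) training data, then
    # classify by comparing the two class-conditional densities at x.
    import numpy as np

    rng = np.random.default_rng(0)
    x_circles     = rng.normal(0.7, 0.10, 500)   # feature x for class w1 (circles)
    x_non_circles = rng.normal(0.4, 0.15, 500)   # feature x for class w2 (non-circles)

    bins = np.linspace(0, 1, 21)
    p_x_given_w1, _ = np.histogram(x_circles, bins=bins, density=True)
    p_x_given_w2, _ = np.histogram(x_non_circles, bins=bins, density=True)

    def classify(x):
        # Equal priors, equal error costs: pick the class with the larger density at x.
        i = np.clip(np.digitize(x, bins) - 1, 0, len(bins) - 2)
        return "circle" if p_x_given_w1[i] >= p_x_given_w2[i] else "non-circle"

    print(classify(0.65), classify(0.35))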

  3. What if we have prior information? • Bayesian probabilities say that if we only expect 10% of the objects to be circles, that should affect our classification Q5-8

  4. Bayesian classifier in general • Bayes rule: P(w|x) = P(x|w)P(w)/P(x) • Verify with example • For classifiers: • x = feature(s) • wi = class • P(w|x) = posterior probability • P(x|w) = likelihood, learned from examples (histogram) • P(w) = prior, learned from the training set (or left out if unknown) • P(x) = unconditional probability, fixed for a given x • Find the best class by the maximum a posteriori (MAP) principle: find the class i that maximizes P(wi|x). • The denominator doesn't affect the calculations • Example: indoor/outdoor classification
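
Continuing the sketch above, and using slide 3's 10% figure as the prior on circles: the MAP rule re-weights the same histogram likelihoods by P(w), and the denominator P(x) is dropped because it does not change the argmax. The numbers are illustrative only.

    # MAP classification: pick the class maximizing P(x|w) * P(w).
    # Reuses bins, p_x_given_w1, p_x_given_w2 from the sketch above.
    priors = {"circle": 0.10, "non-circle": 0.90}   # slide 3: only 10% circles expected

    def classify_map(x):
        i = np.clip(np.digitize(x, bins) - 1, 0, len(bins) - 2)
        # Posterior up to the constant 1/P(x), which does not affect the argmax.
        posterior = {"circle": p_x_given_w1[i] * priors["circle"],
                     "non-circle": p_x_given_w2[i] * priors["non-circle"]}
        return max(posterior, key=posterior.get)

    print(classify_map(0.65))   # the skewed prior can flip borderline cases to non-circle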

  5. Indoor vs. outdoor classification • I can use low-level image info (color, texture, etc.) • But there's another source of really helpful info!

  6. Camera Metadata Distributions [Figure: distributions of camera metadata features: subject distance, scene brightness, flash, exposure time]

  7. Why we need Bayes Rule • Problem: We know conditional probabilities like P(flash was on | indoor) • We want to find conditional probabilities like P(indoor | flash was on, exp time = 0.017, sd = 8 ft, SVM output) • Let w = class of image, and x = all the evidence. • More generally, we know P(x|w) from the training set (why?), but we want P(w|x)

  8. Using Bayes Rule • P(w|x) = P(x|w)P(w)/P(x) • The denominator is constant for a given image, so P(w|x) = a P(x|w) P(w), where a = 1/P(x) Q9

  9. Using Bayes Rule • P(w|x) = P(x|w)P(w)/P(x) • The denominator is constant for a given image, so P(w|x) = a P(x|w) P(w) • We have two types of features, from image metadata (M) and from low-level features like color (L) • Conditional independence means P(x|w) = P(M|w) P(L|w) • So P(w|x) = a P(M|w) P(L|w) P(w), where P(M|w) comes from histograms, P(L|w) comes from the SVM, and P(w) is the prior (initial bias)
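
A sketch of this fusion under the conditional-independence assumption. The metadata likelihood tables and the logistic squash from an SVM score to a stand-in for P(L|w) are assumptions made up for illustration; in the actual system P(M|w) comes from the metadata histograms and P(L|w) from the low-level-feature SVM.

    # P(w|x) is proportional to P(M|w) * P(L|w) * P(w): fuse metadata and low-level evidence.
    import math

    prior = {"indoor": 0.5, "outdoor": 0.5}          # P(w)

    # P(metadata | class); invented stand-ins for the histograms of slide 6.
    p_flash_on  = {"indoor": 0.70, "outdoor": 0.10}
    p_short_exp = {"indoor": 0.20, "outdoor": 0.80}  # e.g. exposure time < 1/60 s

    def p_low_level(svm_score, w):
        # Hypothetical squash of an SVM score (positive = outdoor) into a
        # value used in place of P(L|w).
        p_outdoor = 1.0 / (1.0 + math.exp(-svm_score))
        return p_outdoor if w == "outdoor" else 1.0 - p_outdoor

    def classify(flash_on, short_exposure, svm_score):
        scores = {}
        for w in ("indoor", "outdoor"):
            p_m = ((p_flash_on[w] if flash_on else 1 - p_flash_on[w]) *
                   (p_short_exp[w] if short_exposure else 1 - p_short_exp[w]))
            scores[w] = p_m * p_low_level(svm_score, w) * prior[w]   # up to 1/P(x)
        return max(scores, key=scores.get)

    print(classify(flash_on=True, short_exposure=False, svm_score=0.3))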

  10. Bayesian network • Efficient way to encode conditional probability distributions and calculate marginals • Use for classification by having the classification node at the root • Examples • Indoor-outdoor classification • Automatic image orientation detection

  11. Indoor vs. outdoor classification • Each edge in the graph has an associated matrix of conditional probabilities [Figure: Bayesian network whose evidence nodes include SVM outputs on color features and texture features, KL divergence, and the EXIF header]
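
A sketch of the machinery on slides 10-11: each edge carries a matrix of conditional probabilities, and marginals (and the posterior at the root classification node) come from summing the joint over the other variables. The numbers below are invented, and only a single evidence node is shown to keep it small.

    import numpy as np

    p_class = np.array([0.4, 0.6])                 # P(class): [indoor, outdoor]
    # Edge matrix P(flash | class): rows = class, columns = flash off/on.
    p_flash_given_class = np.array([[0.3, 0.7],    # indoor
                                    [0.9, 0.1]])   # outdoor

    # Marginal P(flash) = sum over classes of P(flash | class) P(class).
    p_flash = p_class @ p_flash_given_class
    print("P(flash on) =", p_flash[1])

    # Posterior at the root (the classification node) given flash = on.
    joint = p_class * p_flash_given_class[:, 1]
    print("P(indoor | flash on) =", joint[0] / joint.sum())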

  12. Effects of Image Capture Context • Recall for a class C is the fraction of C classified correctly
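
For reference, recall computed directly from that definition (the labels below are made up):

    def recall(true_labels, predicted_labels, c):
        # Fraction of examples whose true class is c that were also predicted as c.
        relevant = [(t, p) for t, p in zip(true_labels, predicted_labels) if t == c]
        return sum(1 for t, p in relevant if p == c) / len(relevant) if relevant else 0.0

    print(recall(["in", "in", "out", "in"], ["in", "out", "out", "in"], "in"))  # 2/3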

  13. Orientation detection • See IEEE TPAMI paper • Hardcopy or posted • Also uses single-feature Bayesian classifier (answer to #1-4) • Keys: • 4-class problem (North, South, East, West) • Priors really helped here! • You should be able to understand the two papers (both posted)
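
The same MAP rule extends directly to the four orientation classes. The likelihoods and the skewed prior below are invented, but they illustrate why a strong prior toward the most common orientation helps so much here.

    # Hypothetical single-feature likelihoods P(x | orientation) for one image.
    likelihood = {"N": 0.28, "S": 0.22, "E": 0.30, "W": 0.20}
    # Hypothetical prior: most photos are taken in one orientation.
    prior = {"N": 0.70, "S": 0.05, "E": 0.125, "W": 0.125}

    posterior = {c: likelihood[c] * prior[c] for c in likelihood}   # up to 1/P(x)
    print(max(posterior, key=posterior.get))   # "N": the prior overrides the slight
                                               # likelihood preference for "E"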
