
Pattern Recognition: Statistical and Neural


Presentation Transcript


  1. Nanjing University of Science & Technology Pattern Recognition: Statistical and Neural Lonnie C. Ludeman Lecture 13 Oct 14, 2005

  2. Lecture 13 Topics 1. Multiple observation, multiple class example (review): sufficient statistic space and likelihood ratio space 2. Calculation of P(error) for the 2-class case: several special cases 3. P(error) calculation examples for the special cases (2-class case)

  3. Example 1: Multiple observations, multiple classes. Given: the pattern vector x is composed of N independent observations of a Gaussian random variable X, with class conditional densities as follows for each component. A zero-one cost function is given. Find: (a) the Bayes decision rule in a sufficient statistic space; (b) the Bayes decision rule in a space of likelihood ratios.
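The densities and cost function displayed here were images in the original slide and are not recoverable verbatim. A plausible reconstruction, using slide 4's statement that the component means are m_i = i (the common variance σ² is an assumption), is:

$$p(x_k \mid C_i) = \frac{1}{\sqrt{2\pi}\,\sigma}\exp\!\left(-\frac{(x_k - m_i)^2}{2\sigma^2}\right),\qquad m_i = i,\quad i = 1, 2, 3$$

$$C_{ij} = \begin{cases}0, & i = j\\ 1, & i \neq j\end{cases}\qquad\text{(zero-one cost function)}$$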

  4. Solution: (a) Since the observations are independent, the joint conditional density is the product of the marginal densities,

$$p(\mathbf{x} \mid C_i) = \prod_{k=1}^{N} p(x_k \mid C_i)$$

for i = 1, 2, 3, with m_i = i. The Bayes decision rule is determined from a set of y_i(x) defined for M = 3 by

$$y_i(\mathbf{x}) = \sum_{j=1}^{3} C_{ij}\, p(\mathbf{x} \mid C_j)\, P(C_j)$$

  5. Substituting the given properties gives the y_i(x). The region in which to decide C1 is found by setting the inequalities y_1(x) < y_2(x) and y_1(x) < y_3(x). Therefore the region R1 for deciding C1 reduces to the x that satisfy both inequalities.
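The reduced inequalities were an image in the original. The reduction itself follows directly from the zero-one costs, under which minimizing y_i(x) is equivalent to maximizing p(x | C_i)P(C_i); the region R1 then takes the form below (the slide's exact display may differ):

$$R_1:\quad p(\mathbf{x}\mid C_1)P(C_1) > p(\mathbf{x}\mid C_2)P(C_2)\quad\text{and}\quad p(\mathbf{x}\mid C_1)P(C_1) > p(\mathbf{x}\mid C_3)P(C_3)$$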

  6. Similarly, the regions R2 and R3 follow from the corresponding pairs of inequalities. Substituting the conditional densities, taking the ln of both sides, and simplifying, the decision rule reduces to regions in a sufficient statistic space s as follows.
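The resulting regions were an image. Carrying the algebra through under the reconstruction above (Gaussian components with means m_i = i and common variance, plus the further assumption of equal priors), the sample mean is a sufficient statistic and the boundaries fall midway between the class means:

$$s = \frac{1}{N}\sum_{k=1}^{N} x_k,\qquad R_1:\ s < \tfrac{3}{2},\qquad R_2:\ \tfrac{3}{2} < s < \tfrac{5}{2},\qquad R_3:\ s > \tfrac{5}{2}$$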

  7. This rule is shown below in the sufficient statistic space s. [Figure: decision regions R1, R2, R3 along the s axis.] An intuitively pleasing result!

  8. (b) Bayes Decision Rule in Likelihood Ratio Space: M-Class Case derivation. We know that the Bayes decision rule for the M-class case is: if y_i(x) < y_j(x) for all j ≠ i, then decide x is from C_i, where

$$y_i(\mathbf{x}) = \sum_{j=1}^{M} C_{ij}\, p(\mathbf{x} \mid C_j)\, P(C_j)$$
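A minimal sketch of this rule in Python, using the running three-class example (the unit variance and equal priors are assumptions, not given on the slides):

```python
import numpy as np
from scipy.stats import norm

priors = np.array([1/3, 1/3, 1/3])      # P(C_j): equal priors (assumed)
costs = 1.0 - np.eye(3)                 # zero-one cost matrix C_ij

def class_cond_density(x, i):
    """p(x | C_i): product of independent N(m_i, 1) components, m_i = i."""
    return np.prod(norm.pdf(x, loc=i, scale=1.0))

def bayes_decide(x):
    """Decide the class C_i minimizing y_i(x) = sum_j C_ij p(x|C_j) P(C_j)."""
    px = np.array([class_cond_density(x, i) for i in (1, 2, 3)])
    y = costs @ (px * priors)           # vector of y_i values
    return int(np.argmin(y)) + 1        # class label in {1, 2, 3}

x = np.array([1.2, 0.8, 1.5])           # example observation vector (N = 3)
print(bayes_decide(x))                  # -> 1, since the sample mean 1.17 < 1.5
```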

  9. Dividing through by p(x | C_M) gives sufficient statistics v_i(x) as follows:

$$v_i(\mathbf{x}) = \sum_{j=1}^{M} C_{ij}\, L_j(\mathbf{x})\, P(C_j),\qquad L_j(\mathbf{x}) = \frac{p(\mathbf{x} \mid C_j)}{p(\mathbf{x} \mid C_M)}$$

Note that L_M(x) = p(x | C_M) / p(x | C_M) = 1. Therefore the decision rule becomes: if v_i(x) < v_j(x) for all j ≠ i, decide x is from C_i.

  10. Bayes Decision Rule in the Likelihood Ratio Space. Since L_M(x) = 1 identically, the dimension of the likelihood ratio space is always one less than the number of classes (M - 1).
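For the running example with M = 3, the rule therefore lives in a plane whose coordinates are the two free ratios:

$$\bigl(L_1(\mathbf{x}),\, L_2(\mathbf{x})\bigr) = \left(\frac{p(\mathbf{x} \mid C_1)}{p(\mathbf{x} \mid C_3)},\ \frac{p(\mathbf{x} \mid C_2)}{p(\mathbf{x} \mid C_3)}\right)$$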

  11. Back to the example: define the likelihood ratios as L_1(x) = p(x | C_1) / p(x | C_3) and L_2(x) = p(x | C_2) / p(x | C_3). We have already determined the region in which to decide C1; dividing both sides of its inequalities by p(x | C_3) gives the following inequalities in the likelihood ratio space for deciding C1.
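The displayed inequalities were an image. Dividing the R1 inequalities reconstructed above by p(x | C_3) gives (again, the slide's exact form may differ):

$$R_1:\quad L_1(\mathbf{x}) > \frac{P(C_2)}{P(C_1)}\, L_2(\mathbf{x})\quad\text{and}\quad L_1(\mathbf{x}) > \frac{P(C_3)}{P(C_1)}$$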

  12. The other regions are determined in the same fashion, giving the decision regions in the likelihood ratio space.

  13. Calculation of the Probability of Error for the 2-Class Gaussian Case. Special Case 1: We know the optimum Bayes decision rule is given by:
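The conditions and rule on this slide were images. For two classes the Bayes rule is the likelihood ratio test, and under the presumed Special Case 1 assumption of equal covariance matrices (K_1 = K_2 = K; an assumption here, since the slide's conditions are not recoverable) taking logarithms reduces it to a linear sufficient statistic z:

$$\frac{p(\mathbf{x} \mid C_1)}{p(\mathbf{x} \mid C_2)} \underset{C_2}{\overset{C_1}{\gtrless}} \frac{(C_{12} - C_{22})\,P(C_2)}{(C_{21} - C_{11})\,P(C_1)} \equiv \tau$$

$$z = (\mathbf{M}_1 - \mathbf{M}_2)^{T} K^{-1}\mathbf{x} \underset{C_2}{\overset{C_1}{\gtrless}} T,\qquad T = \ln\tau + \tfrac{1}{2}\left(\mathbf{M}_1^{T} K^{-1}\mathbf{M}_1 - \mathbf{M}_2^{T} K^{-1}\mathbf{M}_2\right)$$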

  14. The sufficient statistic Z conditioned on C1 has the following mean and variance

  15. The conditional variance becomes v1; thus under C1 we have: Z ~ N(a1, v1).

  16. Similarly, the conditional mean and variance under class C2 are a2 and v2. The statistic Z under class C2 is also Gaussian; thus under C2 we have: Z ~ N(a2, v2).
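The expressions for a1, v1, a2, and v2 were images. With z as defined in the sketch above (equal covariance assumed), the standard results are:

$$a_1 = (\mathbf{M}_1 - \mathbf{M}_2)^{T} K^{-1}\mathbf{M}_1,\qquad a_2 = (\mathbf{M}_1 - \mathbf{M}_2)^{T} K^{-1}\mathbf{M}_2$$

$$v_1 = v_2 = (\mathbf{M}_1 - \mathbf{M}_2)^{T} K^{-1}(\mathbf{M}_1 - \mathbf{M}_2)$$

the common variance being the squared Mahalanobis distance between the means.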

  17. Determination of the P(error). The total probability theorem states

$$P(\text{error}) = P(\text{error} \mid C_1)\,P(C_1) + P(\text{error} \mid C_2)\,P(C_2)$$

where P(error | C_i) is the probability of deciding incorrectly when x actually comes from class C_i.

  18. Since the scalar Z is Gaussian, the error conditioned on C1 becomes:

  19. Similarly, the error conditioned on C2 follows. Finally, the total P(error) for Special Case 1 becomes:
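These displays were images. With the rule "decide C1 when z > T" and a1 > a2 (a1 − a2 equals the squared Mahalanobis distance, which is positive), the standard expressions in terms of the Gaussian tail function are:

$$P(\text{error} \mid C_1) = P(z < T \mid C_1) = Q\!\left(\frac{a_1 - T}{\sqrt{v_1}}\right),\qquad P(\text{error} \mid C_2) = P(z > T \mid C_2) = Q\!\left(\frac{T - a_2}{\sqrt{v_2}}\right)$$

$$P(\text{error}) = P(C_1)\,Q\!\left(\frac{a_1 - T}{\sqrt{v_1}}\right) + P(C_2)\,Q\!\left(\frac{T - a_2}{\sqrt{v_2}}\right),\qquad Q(u) = \int_{u}^{\infty}\frac{1}{\sqrt{2\pi}}\,e^{-t^2/2}\,dt$$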

  20. Special Case 2: equal scaled identity covariance matrices (K1 = K2 = σ²I). Using the previous formula, the P(error) reduces to the expression below, where d is the Euclidean distance between the means.
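The reduced expression was an image. A hedged reconstruction: substituting K = σ²I into the Special Case 1 quantities gives

$$v_1 = v_2 = \frac{d^2}{\sigma^2},\qquad d = \|\mathbf{M}_1 - \mathbf{M}_2\|\ \text{(Euclidean distance between the means)}$$

so the Special Case 1 error formula applies with these values.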

  21. Special Case 3: zero-one Bayes costs and equal a priori probabilities. Using the previous formula for P(error) gives:
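The displayed result was an image. With zero-one costs and P(C1) = P(C2) = 1/2 the threshold is τ = 1, which places T midway between a1 and a2; continuing under the scaled-identity covariance assumption of Case 2, both conditional errors equal the same tail probability and

$$P(\text{error}) = Q\!\left(\frac{d}{2\sigma}\right),\qquad d = \|\mathbf{M}_1 - \mathbf{M}_2\|$$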

  22. Special Case 4: Then

  23. Example: Calculation of Probability of Error. Given: Find: P(error) under assumptions (a) through (d) below.
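The numerical data for this example were images and are not recoverable, so the sketch below uses invented numbers (means, covariance, and priors are all hypothetical) purely to illustrate the computation chain of the special cases above:

```python
import numpy as np
from scipy.stats import norm

# Hypothetical 2-class Gaussian problem; the slide's actual data were
# images and are not recoverable.
M1 = np.array([0.0, 0.0])
M2 = np.array([2.0, 1.0])
K = np.eye(2)                            # shared covariance (Special Case 1)
P1, P2 = 0.5, 0.5                        # equal priors (Special Case 3)

Kinv = np.linalg.inv(K)
w = Kinv @ (M1 - M2)                     # z = (M1 - M2)^T K^{-1} x = w^T x
a1, a2 = w @ M1, w @ M2                  # conditional means of z
v = (M1 - M2) @ Kinv @ (M1 - M2)         # common variance = d_M^2

T = 0.5 * (a1 + a2)                      # zero-one costs, equal priors

Q = norm.sf                              # Gaussian tail function Q(u)
p_err = P1 * Q((a1 - T) / np.sqrt(v)) + P2 * Q((T - a2) / np.sqrt(v))
print(p_err)                             # equals Q(d_M / 2) ≈ 0.1318 here
```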

  24. (a) Solution:

  25. (b) Solution: Substituting the above into the P(error) gives:

  26. (c) Solution: Substituting the above into the P(error) gives:

  27. (d) Solution: Substituting the above into the P(error) for the case of equal covariance matrices gives:

  28. (d) Solution Continued:

  29. Lecture 13 Summary 1. Multiple observation, multiple class example (review): sufficient statistic space and likelihood ratio space 2. Calculation of P(error) for the two-class case: special cases 3. P(error) calculation examples for the special cases (2-class case)

  30. End of Lecture 13
