
Bayesian Learning & Gaussian Mixture Models

Bayesian Learning & Gaussian Mixture Models. Jianping Fan Dept of Computer Science UNC-Charlotte. Basic Classification. Input. Output. Spam vs. Not-Spam. Spam filtering. Binary. !!!!$$$!!!!. Multi-Class. Character recognition. C. C vs. other 25 characters.

claudiab
Télécharger la présentation

Bayesian Learning & Gaussian Mixture Models

An Image/Link below is provided (as is) to download presentation Download Policy: Content on the Website is provided to you AS IS for your information and personal use and may not be sold / licensed / shared on other websites without getting consent from its author. Content is provided to you AS IS for your information and personal use only. Download presentation by click this link. While downloading, if for some reason you are not able to download a presentation, the publisher may have deleted the file from their server. During download, if you can't get a presentation, the file might be deleted by the publisher.

E N D

Presentation Transcript


  1. Bayesian Learning & Gaussian Mixture Models Jianping Fan Dept of Computer Science UNC-Charlotte

  2. Basic Classification: input/output examples. Binary: spam filtering, input an email (e.g. "!!!!$$$!!!!"), output Spam vs. Not-Spam. Multi-class: character recognition, input an image of a character, output 'C' vs. the other 25 characters.

  3. Structured Classification: input/output examples with structured outputs. Handwriting recognition: input a handwritten word image, output the word "brace". 3D object recognition: input an image of a building, output a tree of its parts.

  4. Overview of Bayesian Decision • Bayesian classification: one example • E.g. how to decide whether a patient is sick or healthy, based on • a probabilistic model of the observed data (the data distributions) • prior knowledge (the ratio or importance of each class)

  5. Bayes’ Rule: Who is who in Bayes’ rule
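
The Bayes' rule formula on this slide appeared as an image in the original deck. In the d (data) and h (hypothesis/class) notation of the next slide, the standard statement, which answers the "who is who" question, is

    P(h \mid d) = \frac{P(d \mid h)\, P(h)}{P(d)}

with P(h) the prior, P(d \mid h) the likelihood, P(d) the evidence, and P(h \mid d) the posterior.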

  6. Classification problem • Training data: examples of the form (d, h(d)) • where d is a data object to classify (the input) • and h(d) is the correct class label for d, with h(d) ∈ {1, …, K} • Goal: given a new object d_new, predict h(d_new)

  7. Why Bayesian? • Provides practical learning algorithms • E.g. Naïve Bayes • Prior knowledge and observed data can be combined • It is a generative (model-based) approach, which offers a useful conceptual framework • E.g. sequences can also be classified, based on a probabilistic model specification • Any kind of object can be classified, based on a probabilistic model specification

  8. Gaussian Mixture Model (GMM)

  9. Gaussian Mixture Model (GMM)

  10. Gaussian Mixture Model (GMM)

  11. Univariate Normal Sample: sampling x_1, …, x_n i.i.d. from N(μ, σ²).

  12. Maximum Likelihood: given the sample x, the likelihood L(μ, σ² | x) is a function of μ and σ², and we want to maximize it.

  13. Log-Likelihood Function: maximize the log-likelihood ℓ(μ, σ²) = log L(μ, σ² | x) instead, by setting ∂ℓ/∂μ = 0 and ∂ℓ/∂σ² = 0.

  14. Max. the Log-Likelihood Function

  15. Max. the Log-Likelihood Function
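
The derivations on slides 13-15 appeared as images. A standard reconstruction for an i.i.d. sample x_1, …, x_n from N(μ, σ²): the log-likelihood and the estimators obtained by setting its partial derivatives to zero are

    \ell(\mu, \sigma^2) = -\frac{n}{2}\log(2\pi\sigma^2) - \frac{1}{2\sigma^2}\sum_{i=1}^{n}(x_i - \mu)^2

    \hat{\mu} = \frac{1}{n}\sum_{i=1}^{n} x_i, \qquad \hat{\sigma}^2 = \frac{1}{n}\sum_{i=1}^{n}(x_i - \hat{\mu})^2

i.e. the maximum-likelihood estimates are the sample mean and the (biased) sample variance.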

  16. Missing Data: sampling in which some of the data values are missing.

  17. E-Step: let (μ^(t), (σ²)^(t)) be the estimated parameters at the start of the t-th iteration.

  18. E-Step: let (μ^(t), (σ²)^(t)) be the estimated parameters at the start of the t-th iteration.

  19. M-Step: let (μ^(t), (σ²)^(t)) be the estimated parameters at the start of the t-th iteration (the resulting updates are sketched below).
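
The update formulas on slides 17-19 appeared as images. For a univariate normal with m of the n values missing, a standard reconstruction (consistent with the setup above, though not necessarily the slides' exact notation) is: in the E-step each missing value contributes its conditional expectations E[x] = μ^(t) and E[x²] = (μ^(t))² + (σ²)^(t); the M-step then gives

    \mu^{(t+1)} = \frac{1}{n}\Big( \sum_{i \in \mathrm{obs}} x_i + m\,\mu^{(t)} \Big)

    (\sigma^2)^{(t+1)} = \frac{1}{n}\Big( \sum_{i \in \mathrm{obs}} x_i^2 + m\big( (\mu^{(t)})^2 + (\sigma^2)^{(t)} \big) \Big) - \big( \mu^{(t+1)} \big)^2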

  20. Exercise: n = 40 (10 data values missing). Estimate μ and σ² using different initial conditions. Observed values: 375.081556, 362.275902, 332.612068, 351.383048, 304.823174, 386.438672, 430.079689, 395.317406, 369.029845, 365.343938, 243.548664, 382.789939, 374.419161, 337.289831, 418.928822, 364.086502, 343.854855, 371.279406, 439.241736, 338.281616, 454.981077, 479.685107, 336.634962, 407.030453, 297.821512, 311.267105, 528.267783, 419.841982, 392.684770, 301.910093.
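
As one way to work the exercise, here is a minimal NumPy sketch of the EM iteration above applied to the 30 observed values; the variable names, iteration count, and initial guesses are my own choices, not part of the slides.

import numpy as np

# The 30 observed values from the exercise; n = 40, so m = 10 values are missing.
x_obs = np.array([
    375.081556, 362.275902, 332.612068, 351.383048, 304.823174,
    386.438672, 430.079689, 395.317406, 369.029845, 365.343938,
    243.548664, 382.789939, 374.419161, 337.289831, 418.928822,
    364.086502, 343.854855, 371.279406, 439.241736, 338.281616,
    454.981077, 479.685107, 336.634962, 407.030453, 297.821512,
    311.267105, 528.267783, 419.841982, 392.684770, 301.910093,
])
n, m = 40, 10

def em_missing_normal(mu, var, n_iter=100):
    """EM for N(mu, var) when m of the n samples are missing."""
    for _ in range(n_iter):
        # E-step: each missing value contributes E[x] = mu and E[x^2] = mu^2 + var.
        s1 = x_obs.sum() + m * mu
        s2 = (x_obs ** 2).sum() + m * (mu ** 2 + var)
        # M-step: re-estimate the parameters from the expected sufficient statistics.
        mu = s1 / n
        var = s2 / n - mu ** 2
    return mu, var

# Try different initial conditions, as the exercise asks.
for mu0, var0 in [(100.0, 100.0), (400.0, 1000.0)]:
    print(em_missing_normal(mu0, var0))

For this model the iteration converges to the mean and variance of the observed values regardless of the starting point.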

  21. Multinomial Population: sampling N samples from a multinomial distribution.

  22. Maximum Likelihood: the likelihood of the N samples.

  23. Maximum Likelihood: the likelihood of the N samples; we want to maximize it.

  24. Log-Likelihood
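
The likelihood formulas on slides 21-24 appeared as images. A standard reconstruction for a multinomial population with k cells, cell probabilities p_1, …, p_k, and observed counts x_1, …, x_k out of N samples:

    P(x_1, \ldots, x_k \mid p) = \frac{N!}{x_1! \cdots x_k!}\, p_1^{x_1} \cdots p_k^{x_k}

    \log L(p) = \mathrm{const} + \sum_{j=1}^{k} x_j \log p_j, \qquad \hat{p}_j = \frac{x_j}{N}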

  25. Mixed Attributes: sampling N samples, but x3 is not available (not directly observed).

  26. E-Step: sampling N samples with x3 not available. Given θ^(t), what can you say about x3?

  27. M-Step

  28. Exercise: estimate the parameters using different initial conditions.

  29. Binomial/Poisson Mixture. M: married obasong, X: # children. The table gives the number of obasongs (n0, n1, …, n6) having 0, 1, …, 6 children; married obasongs may have children, unmarried obasongs have no children.

  30. Binomial/Poisson Mixture. Same table; unobserved data: nA = # married obasongs (with no children) and nB = # unmarried obasongs, so that nA + nB = n0.

  31. Binomial/Poisson Mixture, complete data: the zero-children count n0 is split into nA and nB, with probabilities pA and pB, while the counts n1, …, n6 have probabilities p1, …, p6.

  32. Binomial/Poisson Mixture, complete data: the same table with the cell probabilities pA, pB, p1, …, p6 attached to the counts nA, nB, n1, …, n6.

  33. Complete-Data Likelihood: the likelihood of the complete data (nA, nB, n1, …, n6) under the cell probabilities pA, pB, p1, …, p6.

  34. Maximum Likelihood
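
The EM derivation for the binomial/Poisson mixture on slides 29-34 appeared as images. Below is a minimal NumPy sketch under one common parameterization: an obasong is married with probability ξ, a married obasong's number of children is Poisson(λ), and unmarried obasongs have no children, so only the zero-children count n0 mixes the two groups. The symbols ξ (xi) and λ (lam), the function name, and the example counts are my own choices, not taken from the slides.

import numpy as np

def em_poisson_mixture(n_counts, xi, lam, n_iter=200):
    """EM for the 'married/unmarried obasong' binomial/Poisson mixture.

    n_counts[k] = number of obasongs with k children (k = 0, 1, ..., 6).
    xi = P(married); lam = Poisson mean of a married obasong's children.
    """
    n_counts = np.asarray(n_counts, dtype=float)
    n = n_counts.sum()
    ks = np.arange(len(n_counts))
    for _ in range(n_iter):
        # E-step: split n0 into its expected married (nA) and unmarried parts.
        p_married_zero = xi * np.exp(-lam)   # married AND zero children
        p_unmarried = 1.0 - xi               # unmarried (always zero children)
        nA = n_counts[0] * p_married_zero / (p_married_zero + p_unmarried)
        # M-step: expected married count is nA plus everyone with >= 1 child;
        # lam is the average number of children among the married.
        n_married = nA + n_counts[1:].sum()
        xi = n_married / n
        lam = (ks * n_counts).sum() / n_married
    return xi, lam

# Example call with made-up counts n0..n6 (the slide's actual numbers were in the image table).
print(em_poisson_mixture([3062, 587, 284, 103, 33, 4, 2], xi=0.5, lam=1.0))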

  35. Latent Variables: the incomplete (observed) data X together with the latent variables Y forms the complete data (X, Y).

  36. Complete Data → Complete-Data Likelihood

  37. Complete-Data Likelihood: it is a function of the latent variable Y and the parameter θ. For a fixed Y it is a function of the parameter θ alone; if instead we are given θ, it becomes a function of the random variable Y, so the result is expressed in terms of Y and is computable.

  38. Expectation Step: let θ^(i-1) be the parameter vector obtained at the (i-1)-th step, and define the Q-function shown below.

  39. Maximization Step: let θ^(i-1) be the parameter vector obtained at the (i-1)-th step, and choose θ^(i) to maximize that Q-function (see below).
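
The definitions on slides 38-39 appeared as images. In the notation of slide 37 (observed data X, latent variable Y, parameter θ), the standard E-step and M-step are:

    Q(\theta, \theta^{(i-1)}) = E\big[ \log L(\theta \mid X, Y) \;\big|\; X, \theta^{(i-1)} \big]

    \theta^{(i)} = \arg\max_{\theta}\, Q(\theta, \theta^{(i-1)})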

  40. Mixture Models • If there is reason to believe that a data set comprises several distinct populations, a mixture model can be used. • It has the form p(x | Θ) = Σ_{j=1..M} α_j p_j(x | θ_j), with α_j ≥ 0 and Σ_j α_j = 1.

  41. Mixture Models: let y_i ∈ {1, …, M} represent the source that generates the i-th data point.

  42. Mixture Models: let y_i ∈ {1, …, M} represent the source that generates the i-th data point.

  43. Mixture Models

  44. Mixture Models
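
The remaining equations on slides 41-44 appeared as images. For the complete data (X, Y), a standard reconstruction of the joint density and the complete-data log-likelihood in the notation above (y_i the source of x_i, mixture weights α_j, component densities p_j) is:

    p(x_i, y_i \mid \Theta) = \alpha_{y_i}\, p_{y_i}(x_i \mid \theta_{y_i})

    \log L(\Theta \mid X, Y) = \sum_{i=1}^{N} \log\big( \alpha_{y_i}\, p_{y_i}(x_i \mid \theta_{y_i}) \big)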

  45. Expectation: the term is zero when y_i ≠ l.

  46. Expectation

  47. Expectation

  48. Expectation 1

  49. Maximization: given the initial guess Θ^g, we want to find Θ that maximizes the above expectation; in practice this is done iteratively.
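
The expectation and maximization formulas on slides 45-49 appeared as images. The standard results for a mixture model, starting from the initial guess Θ^g, are the posterior "responsibilities" and, for the mixture weights, their average over the data:

    p(l \mid x_i, \Theta^g) = \frac{\alpha_l^g\, p_l(x_i \mid \theta_l^g)}{\sum_{j=1}^{M} \alpha_j^g\, p_j(x_i \mid \theta_j^g)}

    \alpha_l^{\mathrm{new}} = \frac{1}{N} \sum_{i=1}^{N} p(l \mid x_i, \Theta^g)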

  50. The GMM (Gaussian Mixture Model): a Gaussian model of a d-dimensional source, say source j, and the GMM with M sources (see the sketch below).
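
The Gaussian density and GMM formulas on this final slide appeared as images. Below is a minimal NumPy sketch of EM for a GMM with M Gaussian sources in d dimensions, tying together the responsibilities and the weight/mean/covariance updates; the initialization, the small regularization term, and the fixed iteration count are my own choices rather than anything from the slides.

import numpy as np

def gmm_em(X, M, n_iter=100, seed=0):
    """Fit a Gaussian Mixture Model with M sources to data X (N x d) via EM."""
    rng = np.random.default_rng(seed)
    N, d = X.shape
    # Initial guess: uniform weights, M random data points as means, shared data covariance.
    alpha = np.full(M, 1.0 / M)
    mu = X[rng.choice(N, M, replace=False)]
    cov = np.array([np.cov(X.T) + 1e-6 * np.eye(d) for _ in range(M)])

    def gauss(x, m, S):
        # N(x; m, S) for a d-dimensional Gaussian, evaluated row-wise on x.
        diff = x - m
        expo = -0.5 * np.sum(diff @ np.linalg.inv(S) * diff, axis=1)
        norm = np.sqrt(((2 * np.pi) ** d) * np.linalg.det(S))
        return np.exp(expo) / norm

    for _ in range(n_iter):
        # E-step: responsibilities r[i, j] = p(source j | x_i, current parameters).
        dens = np.stack([alpha[j] * gauss(X, mu[j], cov[j]) for j in range(M)], axis=1)
        r = dens / dens.sum(axis=1, keepdims=True)
        # M-step: weighted-average updates for weights, means, and covariances.
        Nj = r.sum(axis=0)
        alpha = Nj / N
        mu = (r.T @ X) / Nj[:, None]
        for j in range(M):
            diff = X - mu[j]
            cov[j] = (r[:, j, None] * diff).T @ diff / Nj[j] + 1e-6 * np.eye(d)
    return alpha, mu, cov

# Tiny usage example with synthetic 2-D data drawn from two sources.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0, 1, (200, 2)), rng.normal(4, 1, (200, 2))])
alpha, mu, cov = gmm_em(X, M=2)
print(alpha, mu)

In practice one would usually reach for a library implementation such as sklearn.mixture.GaussianMixture, which runs the same EM procedure.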
