Topic Model Latent Dirichlet Allocation


Presentation Transcript


  1. Topic Model: Latent Dirichlet Allocation. Ouyang Ruofei, May 10, 2013

  2. Introduction. The modeling view: data = latent pattern + noise. Parameters encode the latent pattern; inference estimates them from the data.

  3. Introduction. Parametric model: the number of parameters is fixed w.r.t. the sample size. Nonparametric model: the number of parameters grows with the sample size; the parameter space is infinite-dimensional.

  4. Clustering. Three clusters: 1. Ironman, 2. Thor, 3. Hulk. Each data point carries an indicator variable giving its cluster assignment.

  5. Dirichlet process. Observed counts: Ironman 3 times, Thor 2 times, Hulk 2 times. Even without the likelihood, we know that: 1. there are three clusters; 2. the empirical distribution over the three clusters. A new data point can then be assigned in proportion to these counts.

  6. Dirichlet process. Example: Dirichlet distribution Dir(Ironman, Thor, Hulk), i.e. Dir(α_1, …, α_K) with K = 3. pdf: p(θ | α) = (Γ(Σ_k α_k) / Π_k Γ(α_k)) · Π_k θ_k^(α_k − 1). mean: E[θ_k] = α_k / Σ_j α_j.

  7. Dirichlet process. Conjugate prior: if the counts n = (n_1, …, n_K) follow a multinomial distribution with parameter θ, and θ ~ Dir(α_1, …, α_K), then the posterior is θ | n ~ Dir(α_1 + n_1, …, α_K + n_K). The α_k act as pseudo-counts added to the observed counts.
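
A minimal numerical sketch of this pseudo-count update, using NumPy; the uniform prior Dir(1, 1, 1) is an illustrative choice, not from the slides:

```python
import numpy as np

# Prior pseudo-counts for (Ironman, Thor, Hulk).
alpha = np.array([1.0, 1.0, 1.0])

# Observed counts from slide 5: Ironman 3, Thor 2, Hulk 2.
counts = np.array([3, 2, 2])

# Dirichlet-multinomial conjugacy: posterior = Dir(alpha + counts).
posterior = alpha + counts

# Posterior mean of theta_k is (alpha_k + n_k) / sum(alpha + n).
print(posterior / posterior.sum())   # [0.4 0.3 0.3]

# Draw sample cluster distributions from the posterior.
print(np.random.dirichlet(posterior, size=5))
```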

  8. Dirichlet process. In our Avengers model, K = 3 (Ironman, Thor, Hulk). However, a new, unseen character arrives, and a Dirichlet distribution with fixed K cannot model him. The Dirichlet process takes K to infinity: nonparametric here means an unbounded number of clusters.

  9. Dirichlet process. G ~ DP(α, G0). α: concentration parameter, acting like pseudo-counts spread over the clusters. G0: base distribution of each cluster, the distribution template. Defining property: given any finite partition (A_1, …, A_K) of the space, (G(A_1), …, G(A_K)) ~ Dir(αG0(A_1), …, αG0(A_K)). A Dirichlet process is therefore a distribution over distributions.

  10. Dirichlet process. Construct the Dirichlet process by the CRP. Chinese restaurant process: in a restaurant there are infinitely many tables. Customer 1 sits at an unoccupied table with p = 1. Customer N sits at an occupied table k with p = n_k / (N − 1 + α), where n_k is the number of customers already at table k, and at a new table with p = α / (N − 1 + α). A simulation sketch follows.
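
A short simulation of this seating process; the function name and the choice alpha = 1.0 are illustrative:

```python
import random

def chinese_restaurant_process(n_customers, alpha, seed=0):
    """Simulate CRP table assignments with concentration parameter alpha."""
    rng = random.Random(seed)
    tables = []        # tables[k] = number of customers seated at table k
    assignments = []   # assignments[n] = table index chosen by customer n
    for n in range(n_customers):
        # Customer n+1 joins table k with prob tables[k] / (n + alpha)
        # and opens a new table with prob alpha / (n + alpha).
        r = rng.uniform(0, n + alpha)
        acc = 0.0
        for k, size in enumerate(tables):
            acc += size
            if r < acc:
                tables[k] += 1
                assignments.append(k)
                break
        else:
            tables.append(1)                  # open a new table
            assignments.append(len(tables) - 1)
    return assignments, tables

assignments, tables = chinese_restaurant_process(20, alpha=1.0)
print(tables)   # typically a few large tables and several singletons
```

Larger alpha opens new tables more often, matching its pseudo-count reading on slide 9.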

  11.–14. Dirichlet process (figure-only slides; the illustrations were not preserved in the transcript).

  15. Dirichlet process. In the CRP analogy: customers correspond to data points; tables correspond to clusters.

  16.–17. Dirichlet process. Train the model by Gibbs sampling (figure-only slides; the sampler illustrations were not preserved in the transcript).

  18. Gibbs sampling. Gibbs sampling is an MCMC method for obtaining a sequence of samples from a multivariate distribution. The intuition is to turn one multivariate problem into a sequence of univariate problems: each variable is resampled from its conditional distribution given all the others. In the Dirichlet process model, the multivariate target is the joint distribution over all cluster assignments; the univariate conditional is one data point's assignment given everyone else's.

  19. Gibbs sampling. Pseudocode for the sampler (the original slide's pseudocode was an image; a sketch follows).
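
The slide's pseudocode did not survive as text. As a stand-in, here is a minimal runnable Gibbs sampler for a toy target (a standard bivariate normal with correlation rho), chosen only to show the multivariate-to-univariate pattern; it is not the Dirichlet process sampler itself:

```python
import math
import random

def gibbs_bivariate_normal(n_iters, rho, seed=0):
    """Gibbs sampling for a standard bivariate normal with correlation rho.

    Each full conditional is univariate normal, so the multivariate
    target is sampled one coordinate at a time.
    """
    rng = random.Random(seed)
    x, y = 0.0, 0.0                 # arbitrary initialization
    sd = math.sqrt(1 - rho ** 2)    # conditional standard deviation
    samples = []
    for _ in range(n_iters):
        x = rng.gauss(rho * y, sd)  # draw x | y
        y = rng.gauss(rho * x, sd)  # draw y | x
        samples.append((x, y))
    return samples

samples = gibbs_bivariate_normal(10_000, rho=0.8)
# The empirical correlation of the samples approaches 0.8 after burn-in.
```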

  20. Topic model. A document is a mixture of topics. The topics are latent variables, but we can read the words: words are observed, topics must be inferred from them.

  21.–22. Topic model (figure-only slides; the illustrations were not preserved in the transcript).

  23. Topic model. Resample z_ij, the topic of observed word x_ij, conditioned on all other topics and words: p(z_ij = k | z^(−ij), x) ∝ (n^(−ij)_{d,k} + α) · (n^(−ij)_{k,x_ij} + β) / (n^(−ij)_k + Vβ), where n_{d,k} is the topic/doc count, n_{k,w} the word/topic count, V the vocabulary size, and the superscript −ij excludes the word being resampled.

  24. Topic model. Apply the Dirichlet process in the topic model: learn the distribution over topics in each document, and learn the distribution over topics for each word.

  25. Topic model. The sampler maintains two count tables: a topic/doc table and a word/topic table.

  26. Topic model. Latent Dirichlet allocation vs. the Dirichlet mixture model (the defining formulas on this slide were images and did not survive the transcript; the LDA generative process is written out below).
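
For reference, the standard LDA generative process these slides describe, written out; α and β are the Dirichlet hyperparameters, and D, K, N_d denote the document count, topic count, and length of document d:

```latex
\begin{align*}
\phi_k    &\sim \mathrm{Dir}(\beta)            && k = 1,\dots,K   &&\text{(word distribution of topic } k\text{)}\\
\theta_d  &\sim \mathrm{Dir}(\alpha)           && d = 1,\dots,D   &&\text{(topic proportions of document } d\text{)}\\
z_{di}    &\sim \mathrm{Mult}(\theta_d)        && i = 1,\dots,N_d &&\text{(topic of word } i \text{ in document } d\text{)}\\
w_{di}    &\sim \mathrm{Mult}(\phi_{z_{di}})   &&                 &&\text{(observed word)}
\end{align*}
```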

  27. LDA example. Corpus: d1: ipad apple itunes; d2: apple mirror queen; d3: queen joker ladygaga; d4: queen ladygaga mirror. Vocabulary w: ipad, apple, itunes, mirror, queen, joker, ladygaga. Intended topics (in fact, the topics are latent): t1: product, t2: story, t3: poker.

  28. LDA example. Random initial topic assignments (word/topic): d1: ipad/1 apple/2 itunes/3; d2: apple/2 mirror/1 queen/2; d3: queen/3 joker/3 ladygaga/1; d4: queen/2 ladygaga/1 mirror/2.

  29. LDA example. To resample one word, first withdraw its current assignment from the count tables. Here queen in d3 is withdrawn: d1: ipad/1 apple/2 itunes/3; d2: apple/2 mirror/1 queen/2; d3: joker/3 ladygaga/1 queen/–; d4: queen/2 ladygaga/1 mirror/2.

  30.–31. LDA example. Assignments unchanged from slide 29 while the conditional for queen in d3 is computed from the topic/doc and word/topic counts using the update rule of slide 23. The counts favor topic 2: queen appears twice under topic 2 elsewhere in the corpus (d2 and d4), and not at all under topics 1 or 3.

  32. LDA example. queen in d3 is reassigned to topic 2 and written back into the count tables: d1: ipad/1 apple/2 itunes/3; d2: apple/2 mirror/1 queen/2; d3: joker/3 ladygaga/1 queen/2; d4: queen/2 ladygaga/1 mirror/2. The sampler sweeps over every word this way until the assignments stabilize.
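
A compact collapsed Gibbs sampler for this exact toy corpus, following the update rule from slide 23; the hyperparameters (alpha = 0.5, beta = 0.1), the sweep count, and the seed are illustrative assumptions, not values from the slides:

```python
import random

docs = [
    ["ipad", "apple", "itunes"],
    ["apple", "mirror", "queen"],
    ["queen", "joker", "ladygaga"],
    ["queen", "ladygaga", "mirror"],
]
vocab = sorted({w for d in docs for w in d})
word_id = {w: i for i, w in enumerate(vocab)}
K, V, alpha, beta = 3, len(vocab), 0.5, 0.1

rng = random.Random(0)
# Random initialization of one topic per word token (slide 28).
z = [[rng.randrange(K) for _ in d] for d in docs]

# Count tables (slide 25): topic/doc, word/topic, and per-topic totals.
n_dk = [[0] * K for _ in docs]
n_kw = [[0] * V for _ in range(K)]
n_k = [0] * K
for d, doc in enumerate(docs):
    for i, w in enumerate(doc):
        k = z[d][i]
        n_dk[d][k] += 1
        n_kw[k][word_id[w]] += 1
        n_k[k] += 1

for _ in range(200):                      # Gibbs sweeps
    for d, doc in enumerate(docs):
        for i, w in enumerate(doc):
            k, wid = z[d][i], word_id[w]
            # Withdraw the current assignment (slide 29).
            n_dk[d][k] -= 1; n_kw[k][wid] -= 1; n_k[k] -= 1
            # Conditional from slide 23, up to a constant.
            p = [(n_dk[d][j] + alpha) * (n_kw[j][wid] + beta) / (n_k[j] + V * beta)
                 for j in range(K)]
            # Sample a new topic and write it back (slide 32).
            r, acc = rng.uniform(0, sum(p)), 0.0
            for j, pj in enumerate(p):
                acc += pj
                if r < acc:
                    k = j
                    break
            z[d][i] = k
            n_dk[d][k] += 1; n_kw[k][wid] += 1; n_k[k] += 1

for d, doc in enumerate(docs):
    print([f"{w}/{z[d][i]}" for i, w in enumerate(doc)])
```

After enough sweeps, words such as queen, ladygaga, and mirror tend to end up sharing a topic, mirroring the walkthrough above; exact assignments vary with the seed.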

  33. Further. A Dirichlet distribution prior fixes K topics (usable in both supervised and unsupervised settings); a Dirichlet process prior allows infinitely many topics. Alpha mainly controls how much probability a topic with little training data receives within a document; beta mainly controls how much probability such a topic receives for a word.

  34. Further. The bag-of-words assumption is unrealistic; extensions such as TNG and biLDA model word order. LDA also loses the power-law behavior of natural language, which the Pitman-Yor language model captures. David Blei has done an extensive survey on topic models: http://home.etf.rs/~bfurlan/publications/SURVEY-1.pdf

  35. Q&A
