
Generative Models






Presentation Transcript


  1. Generative Models for probabilistic inference (Michael Stewart)

  2. Remember the Joint Distribution?

  3. What about very large/complex models?
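
  The transcript does not spell out the math behind these two questions, so here is a brief worked statement of the point they raise (standard probability facts, not taken from the deck): any joint distribution factorizes by the chain rule, but representing it as a full table quickly becomes infeasible, which is why large/complex models need the structured, generative factorizations discussed next.

    % Chain rule: always valid, no independence assumptions needed.
    P(X_1,\dots,X_n) = \prod_{i=1}^{n} P(X_i \mid X_1,\dots,X_{i-1})
    % A full joint table over n binary variables has 2^n - 1 free parameters
    % (n = 30 already gives roughly 10^9 numbers).
    % A generative latent-variable factorization is far more compact, e.g.
    P(X_1,\dots,X_n,Z) = P(Z)\,\prod_{i=1}^{n} P(X_i \mid Z)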

  4. "Generative" Modeling Implement probability theory in computer science to infer a joint distribution      Bayesian prior -> posterior provides learning opportunity      Sampling methods are their own field of study Applications in neuroscience, machine learning, biology [1] Requires a model and sampling method...

  5. "Generate" examples

  6. Define a Model
  This is a broad procedure; what is the goal?
  • Bio:
    • observe fMRI data, infer latent locality function in brain
    • observe genome data, infer latent gene relationships
  • AI:
    • observe words, infer latent topics or semantic information
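
  To make the "observe words, infer latent topics" example concrete, here is a toy two-topic generative process in Python, loosely in the spirit of the topic models in the recommended reading; the topics, vocabulary, and probabilities are invented for illustration.

    import random

    # Two made-up topics, each a distribution over a tiny vocabulary.
    topics = {
        "sports":  {"ball": 0.5, "team": 0.4, "gene": 0.1},
        "biology": {"gene": 0.5, "cell": 0.4, "ball": 0.1},
    }

    def sample_word(word_probs):
        # Draw one word from a {word: probability} table.
        r, total = random.random(), 0.0
        for word, p in word_probs.items():
            total += p
            if r < total:
                return word
        return word                      # float-rounding fallback

    def generate_document(topic_weights, length=10):
        # Generative direction: latent topic -> observed word, once per token.
        names, weights = zip(*topic_weights.items())
        return [sample_word(topics[random.choices(names, weights=weights)[0]])
                for _ in range(length)]

    doc = generate_document({"sports": 0.8, "biology": 0.2})
    print(doc)
    # Inference runs the other way: given only documents like `doc`,
    # recover the latent topics and the per-document topic weights.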

  7. How to sample
  • Rejection sampling: no
  • Usually a kind of Gibbs/MCMC
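
  As a minimal sketch of the "usually a kind of Gibbs/MCMC" point, here is a generic Metropolis-Hastings random-walk sampler applied to the coin posterior from the earlier sketch. The step size, starting point, burn-in length, and observed counts are illustrative choices; tools like those on slide 8 are meant to provide this kind of inference machinery automatically.

    import math
    import random

    def log_posterior(theta, heads, tails):
        # Uniform prior on (0, 1) plus Bernoulli likelihood; -inf outside the support.
        if not 0.0 < theta < 1.0:
            return -math.inf
        return heads * math.log(theta) + tails * math.log(1.0 - theta)

    def metropolis_hastings(heads, tails, n_steps=10000, step=0.1):
        theta = 0.5                                    # arbitrary starting point in (0, 1)
        samples = []
        for _ in range(n_steps):
            proposal = theta + random.gauss(0.0, step)         # symmetric random-walk proposal
            log_ratio = (log_posterior(proposal, heads, tails)
                         - log_posterior(theta, heads, tails))
            if random.random() < math.exp(min(0.0, log_ratio)):  # accept w.p. min(1, ratio)
                theta = proposal
            samples.append(theta)
        return samples

    samples = metropolis_hastings(heads=14, tails=6)
    burned_in = samples[1000:]
    print("posterior mean of theta ~", sum(burned_in) / len(burned_in))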

  8. Some implementations:
  • [2] BLOG
  • [3] Church
  • [4] Python modules provided by Tom Haines

  9. Recommended reading
  • A. Daud, J. Li, L. Zhou, and F. Muhammad, "Knowledge discovery through directed probabilistic topic models: a survey," Frontiers of Computer Science in China, vol. 4, no. 2, pp. 280-301, Jun. 2010. [Online]. Available: http://dx.doi.org/10.1007/s11704-009-0062-y
  • M. Steyvers and T. Griffiths, Probabilistic Topic Models. Lawrence Erlbaum Associates, 2007. [Online]. Available: http://www.worldcat.org/isbn/1410615340
  • N. Goodman, J. Tenenbaum, T. O'Donnell, and the Church Working Group, Probabilistic Models of Cognition. [Online]. Available: http://projects.csail.mit.edu/church/wiki/Probabilistic_Models_of_Cognition
  • Coin examples: Church, "Learning as Conditional Inference"

  10. References
  [1] A. Venkataraman, Y. Rathi, M. Kubicki, C.-F. Westin, and P. Golland, "Joint generative model for fMRI/DWI and its application to population studies," Medical Image Computing and Computer-Assisted Intervention, vol. 13, no. Pt 1, pp. 191-199, 2010. [Online]. Available: http://www.pubmedcentral.nih.gov/articlerender.fcgi?artid=3056120&tool=pmcentrez&rendertype=abstract
  [2] B. Milch, B. Marthi, S. Russell, D. Sontag, D. L. Ong, and A. Kolobov, "BLOG: Probabilistic models with unknown objects," in IJCAI, 2005, pp. 1352-1359. [Online]. Available: http://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.116.2131
  [3] N. D. Goodman, V. K. Mansinghka, D. Roy, K. Bonawitz, and J. B. Tenenbaum, "Church: a language for generative models," in Uncertainty in Artificial Intelligence, 2008. [Online]. Available: http://web.mit.edu/droy/www/papers/GooManRoyBonTenUAI2008.pdf
  [4] http://code.google.com/p/haines/
  Plate notation example: M. Steyvers and T. Griffiths, Probabilistic Topic Models. Lawrence Erlbaum Associates, 2007. [Online]. Available: http://www.worldcat.org/isbn/1410615340
