
Posterior Regularization for Structured Latent Variable Models



  1. Posterior Regularization for Structured Latent Variable Models. Li Zhonghua, I2R SMT Reading Group

  2. Outline • Motivation and Introduction • Posterior Regularization • Application • Implementation • Some Related Frameworks

  3. Motivation and Introduction--Prior Knowledge We possess a wealth of prior knowledge about most NLP tasks.

  4. Motivation and Introduction--Prior Knowledge

  5. Motivation and Introduction--Prior Knowledge

  6. Motivation and Introduction--Leveraging Prior Knowledge Possible approaches and their limitations

  7. Motivation and Introduction--Limited Approaches Bayesian Approach: Encode prior knowledge with a prior on parameters. • Limitation: Our prior knowledge is not about parameters! • Parameters are difficult to interpret; it is hard to get the desired effect.

  8. Motivation and Introduction--Limited Approaches Augmenting the Model: Encode prior knowledge with additional variables and dependencies. • Limitation: may make exact inference intractable.

  9. Posterior Regularization • A declarative language for specifying prior knowledge -- Constraint Features & Expectations • Methods for learning with knowledge in this language -- an EM-style learning algorithm

  10. Posterior Regularization

  11. Posterior Regularization Original Objective:
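
  The equation on this slide did not survive the transcript; reconstructing it from the Ganchev et al. (JMLR 2010) paper the talk is based on: standard training maximizes the marginal log-likelihood L(θ) = log p_θ(X), and posterior regularization subtracts from it the KL divergence between the posterior and a constraint set Q defined by bounds on feature expectations:

      J_Q(θ) = L(θ) - min_{q ∈ Q} KL( q(Y) || p_θ(Y|X) ),
      Q = { q(Y) : E_q[ φ(X,Y) ] ≤ b }.

  Here φ are the constraint features and b their expectation bounds, i.e., the declarative language of slide 9.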

  12. Posterior Regularization EM-style learning algorithm
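
  Reconstructed from the same paper, the algorithm alternates a modified E-step, which projects the current posterior onto the constraint set Q, with the usual M-step:

      E'-step: q^{t+1} = argmin_{q ∈ Q} KL( q(Y) || p_{θ^t}(Y|X) )
      M-step:  θ^{t+1} = argmax_θ E_{q^{t+1}}[ log p_θ(X,Y) ]

  As with standard EM, this is coordinate ascent, and each iteration does not decrease the objective J_Q(θ).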

  13. Posterior Regularization Computing the Posterior Regularizer
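
  The projection in the E'-step is solved in the dual, with one multiplier λ_k ≥ 0 per constraint (again a reconstruction from the paper):

      max_{λ ≥ 0}  -b·λ - log Σ_Y p_θ(Y|X) exp( -λ·φ(X,Y) ),

  whose optimum λ* yields the projected posterior as a reweighting of the model posterior:

      q*(Y) ∝ p_θ(Y|X) exp( -λ*·φ(X,Y) ).

  When φ decomposes over the same parts as the model, the normalizer and the dual gradient E_q[φ] - b can be computed with the model's own inference routines (e.g., forward-backward for an HMM), so the projection remains tractable.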

  14. Application Statistical Word Alignment: IBM Model 1 and the HMM
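
  As a one-line refresher (not on the slide): both alignment models factor as p(x, a | y) = Π_j p_d(a_j | a_{j-1}, j) · p_t(x_j | y_{a_j}), where IBM Model 1 takes the distortion p_d to be uniform and the HMM conditions it on the previous alignment position a_{j-1}.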

  15. Application Bijectivity: one feature for each source word m that counts how many times m is aligned to a target word in the alignment y.
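
  In symbols (a reconstruction): writing φ_m(x, y) = Σ_j 1[ y_j = m ] for the number of target positions aligned to source word m, the bijectivity constraint bounds its expectation,

      E_q[ φ_m(x, y) ] ≤ 1   for every source word m,

  so each source word is aligned at most once in expectation.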

  16. Application Symmetry: define a feature for each target-source position pair (i, j). The feature has expectation zero if the word pair (i, j) is aligned with equal probability in both directions.
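
  In symbols (a reconstruction): run the model in both directions and let q range over the pair of directional alignments; for each position pair (i, j) define

      φ_ij(x, y) = +1 if i-j is aligned in the forward direction, -1 if aligned in the backward direction,

  and impose the equality constraint E_q[ φ_ij ] = 0, which forces the two directional models to agree in expectation.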

  17. Application Learning Tractable Word Alignment Models with Complex Constraints (Graça et al., Computational Linguistics 2010)

  18. Application • Six language pairs • Both types of constraints improve over the baseline HMM in terms of both precision and recall • They improve over the HMM by 10% to 15% • S-HMM (symmetry) performs slightly better than B-HMM (bijectivity) overall, winning in 10 out of 12 cases • They improve over IBM Model 4 in 9 out of 12 cases

  19. Application

  20. Implementation • http://code.google.com/p/pr-toolkit/
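
  As a minimal, self-contained sketch of the core computation from slide 13 (illustrative only: the function name pr_project is my own, and a real implementation such as the toolkit above would run forward-backward over the model's structure instead of enumerating every latent configuration), the dual projection can be written in a few lines of Python:

      import numpy as np

      def pr_project(posterior, phi, b, lr=0.1, iters=200):
          """Project a discrete posterior onto Q = {q : E_q[phi] <= b}.

          posterior: (n,)   p_theta(y | x) for each of n configurations
          phi:       (n, k) constraint features phi(x, y)
          b:         (k,)   constraint bounds
          """
          lam = np.zeros_like(b, dtype=float)
          for _ in range(iters):
              # q(y) is proportional to p(y|x) * exp(-lam . phi(y))
              logq = np.log(posterior) - phi @ lam
              logq -= logq.max()  # shift for numerical stability
              q = np.exp(logq)
              q /= q.sum()
              # dual gradient is E_q[phi] - b; ascend, keep lam >= 0
              lam = np.maximum(0.0, lam + lr * (q @ phi - b))
          return q, lam

  The loop is projected gradient ascent on the dual from slide 13: each step reweights the posterior by exp(-λ·φ), measures how far the constraint expectations exceed their bounds, and increases the corresponding multipliers, clipping at zero because the constraints are inequalities.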

  21. Some Related Frameworks

  22. Some Related Frameworks

  23. Some Related Frameworks

  24. Some Related Frameworks

  25. Some Related Frameworks

  26. More info: http://sideinfo.wikkii.com (many of these slides are taken from there). Thanks!
