
A Probabilistic Framework for Structure-based Alignment

Presentation Transcript


  1. A Probabilistic Framework for Structure-based Alignment Kurohashi-lab M2 56430 Toshiaki Nakazawa

  2. Outline • Introduction of Machine Translation • What is Alignment? • Statistical Machine Translation (SMT) • Example-based Machine Translation (EBMT) • Baseline alignment method • A probabilistic framework for alignment • Corresponding Pattern score (CP-score) • Integration of Maximum Entropy (ME) • Experiments and results • Discussion and conclusion

  3. Outline • Introduction of Machine Translation • What is Alignment? • Statistical Machine Translation (SMT) • Example-based Machine Translation (EBMT) • Baseline alignment method • A probabilistic framework for alignment • Corresponding Pattern score (CP-score) • Integration of Maximum Entropy (ME) • Experiments and results • Discussion and conclusion

  4. Standard Way of Machine Translation • [Figure: translation flow — Input, Resource (a parallel corpus processed by alignment), Translation, Output] • Parallel corpus: text written in two different languages with almost the same content • Alignment: finding the correspondences between two parallel sentences (word level, phrase level, etc.) • The performance of alignment affects the accuracy of translation
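
A minimal illustrative sketch (not from the slides) of what a parallel sentence pair and a phrase-level alignment can look like as data; the token spans and links below are made up for illustration and are not a gold-standard alignment.

```python
# Hypothetical illustration: a Japanese-English parallel sentence pair and a
# phrase-level alignment represented as pairs of token-index spans.

japanese = ["交差点", "で", "あの", "車", "が", "飛び出して", "来た"]
english = ["the", "car", "came", "at", "me", "at", "the", "intersection"]

# Each link pairs a Japanese span (start, end) with an English span
# (start, end); spans are end-exclusive token indices.
alignment = [
    ((0, 1), (7, 8)),  # 交差点 <-> intersection
    ((2, 4), (0, 2)),  # あの 車 <-> the car
    ((5, 7), (2, 5)),  # 飛び出して 来た <-> came at me
]

for (js, je), (es, ee) in alignment:
    print("".join(japanese[js:je]), "<->", " ".join(english[es:ee]))
```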

  5. Statistical Machine Translation (SMT) • Learns translation models statistically from a parallel corpus • Does not use any linguistic resources • Small translation unit (= "word") • Recently, the number of studies handling bigger units (= "couples of words" or "phrases") is increasing • Requires a large parallel corpus for highly accurate translation

  6. Basic Method for SMT • Translate by maximizing the probability: e* = argmax_e P(e) P(f | e), where P(e) is the language model and P(f | e) is the translation model • Ex) IBM Models [Brown et al., 93] • Both models are learned from a parallel corpus (usually with an unsupervised learning algorithm)
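
As a toy sketch of the decision rule above (not the IBM Model itself): score each candidate translation e by P(e) · P(f | e) and take the argmax. The candidate set and all probabilities below are invented for illustration; a real SMT system estimates both models from a parallel corpus.

```python
# Toy noisy-channel decoding: e* = argmax_e P(e) * P(f | e).
# All numbers are invented; real models are learned from data.

f = "あの車が来た"  # source sentence (only used conceptually here)
candidates = ["the car came", "that car come", "car the came"]

language_model = {  # P(e): fluency of the target-language side
    "the car came": 0.05,
    "that car come": 0.01,
    "car the came": 0.001,
}
translation_model = {  # P(f | e): adequacy with respect to the source
    "the car came": 0.3,
    "that car come": 0.4,
    "car the came": 0.3,
}

best = max(candidates, key=lambda e: language_model[e] * translation_model[e])
print(best)  # -> "the car came"
```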

  7. Overview of EBMT • [Figure: translation flow — the input is translated into the output using a Translation Memory DataBase (TMDB) built from a parallel corpus by alignment, together with advanced NLP technologies]

  8. Example-based Machine Translation (EBMT) • Divide the input sentence into a few parts • Find similar expressions (examples) in the parallel corpus for each part • Combine the examples to generate the output translation • Uses linguistic resources as much as possible • Larger translation units (larger examples) are better
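
A rough sketch of the EBMT flow just described: split the input, retrieve a translation example for each part, and combine the retrieved fragments. The example database, the exact-match lookup, and the simple concatenation are stand-ins for illustration, not the retrieval and combination actually used in the system.

```python
# Simplified EBMT flow: look up each input part in a translation example
# database (TMDB) and concatenate the retrieved target fragments.

example_db = {  # hypothetical source fragment -> target fragment examples
    "交差点で": "at the intersection",
    "あの車が": "the car",
    "飛び出して来た": "came at me",
}

def translate(parts):
    """Translate each part by example lookup and combine the results."""
    return " ".join(example_db.get(part, "<no example>") for part in parts)

print(translate(["あの車が", "飛び出して来た", "交差点で"]))
# -> "the car came at me at the intersection"
```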

  9. Flow of EBMT

  10. SMT vs. EBMT We introduce a probabilistic framework for structure-based alignment.

  11. Outline • Introduction of Machine Translation • What is Alignment? • Statistical Machine Translation (SMT) • Example-based Machine Translation (EBMT) • Baseline alignment method • A probabilistic framework for alignment • Corresponding Pattern score (CP-score) • Integration of Maximum Entropy (ME) • Experiments and results • Discussion and conclusion

  12. Alignment • Transformation into dependency structure • J: JUMAN/KNP, E: Charniak's nlparser → dependency trees • [Figure: dependency trees for the example pair J: 交差点で、突然あの車が飛び出して来たのです。 E: The car came at me from the side at the intersection.]

  13. Alignment • Transformation into dependency structure • Detection of word(s) correspondences • Bilingual dictionaries • Transliteration detection • ローズワイン → rosuwain ⇔ rose wine (similarity: 0.78) • 新宿 → shinjuku ⇔ shinjuku (similarity: 1.0)
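
One plausible way to score the transliteration matches shown above is a normalized character-level similarity between the romanized Japanese word and the English word; the measure below (difflib's ratio) is an assumption, so its scores need not match the 0.78 reported on the slide.

```python
from difflib import SequenceMatcher

def transliteration_similarity(romaji: str, english: str) -> float:
    """Character-level similarity between a romanized Japanese word and an
    English word (letters only, lower-cased). A stand-in measure."""
    a = "".join(c for c in romaji.lower() if c.isalpha())
    b = "".join(c for c in english.lower() if c.isalpha())
    return SequenceMatcher(None, a, b).ratio()

print(transliteration_similarity("rosuwain", "rose wine"))
# 0.75 with this toy measure (the slide reports 0.78 with its own measure)
print(transliteration_similarity("shinjuku", "shinjuku"))   # 1.0
```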

  14. Alignment • Transformation into dependency structure • Detection of word(s) correspondences • Disambiguation of correspondences

  15. Disambiguation • Score from an unambiguous correspondence C_unamb to an ambiguous correspondence C_amb: 1/(distance in the J tree) + 1/(distance in the E tree), e.g. 1/2 + 1/1 • [Figure: dependency trees for an example Japanese-English pair about filing an insurance claim with an office in Japan]
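
A small sketch of the disambiguation score above: an ambiguous correspondence is scored by its tree distances to an already fixed unambiguous correspondence, 1/(distance in the J tree) + 1/(distance in the E tree). The parent-index tree representation and the tiny example trees are assumptions made for illustration.

```python
# Baseline disambiguation score: 1/(distance in J tree) + 1/(distance in E tree)
# between an ambiguous correspondence and an unambiguous one.
# Dependency trees are given as parent-index arrays (root has parent -1).

def tree_distance(parents, a, b):
    """Number of edges between nodes a and b in a rooted tree."""
    def path_to_root(n):
        path = [n]
        while parents[n] != -1:
            n = parents[n]
            path.append(n)
        return path
    pa, pb = path_to_root(a), path_to_root(b)
    ancestors_b = set(pb)
    da = next(i for i, n in enumerate(pa) if n in ancestors_b)
    db = pb.index(pa[da])
    return da + db  # steps from a and from b down to their lowest common ancestor

def disambiguation_score(j_parents, e_parents, amb, unamb):
    """amb and unamb are (japanese_node, english_node) correspondences."""
    return (1.0 / tree_distance(j_parents, amb[0], unamb[0])
            + 1.0 / tree_distance(e_parents, amb[1], unamb[1]))

j_parents = [-1, 0, 1, 1]  # toy 4-node Japanese dependency tree
e_parents = [-1, 0, 0, 2]  # toy 4-node English dependency tree
print(disambiguation_score(j_parents, e_parents, amb=(2, 3), unamb=(3, 2)))
# distances 2 (J tree) and 1 (E tree) give 1/2 + 1/1 = 1.5
```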

  16. Alignment • Transformation into dependency structure • Detection of word(s) correspondences • Disambiguation of correspondences • Handling of remaining phrases

  17. Alignment • Transformation into dependency structure • Detection of word(s) correspondences • Disambiguation of correspondences • Handling of remaining phrases • Registration to the translation example database

  18. Outline • Introduction of Machine Translation • What is Alignment? • Statistical Machine Translation (SMT) • Example-based Machine Translation (EBMT) • Baseline alignment method • A probabilistic framework for alignment • Corresponding Pattern score (CP-score) • Integration of Maximum Entropy (ME) • Experiments and results • Discussion and conclusion

  19. Corresponding Pattern (CP)

  20. Corresponding Pattern (CP)

  21. Corresponding Pattern (CP) • [Figure: CPs encoded as tuples, e.g. (1, 2) and (1, 1) → (1, 2, 1, 1); (0, 2) and (0, 1) → (0, 2, 0, 1); (0, 1) and (0, 1) → (0, 1, 0, 1)]

  22. CP-score • Assign a score to each CP (= CP-score) • Calculation of the CP-score • Count the frequency of each CP, using the parallel corpus aligned by the baseline alignment method • Divide the frequency by the total frequency of all CPs (the CP-score is a probability of occurrence) • Alignment Score (AS) computed from the CP-scores
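
A minimal sketch of the CP-score and Alignment Score described above: count each CP over a corpus aligned by the baseline method, normalize the counts into probabilities, and combine the CP-scores of a candidate alignment. CPs are treated as opaque tuples, and the product combination (with a small smoothing value for unseen CPs) is an assumption, since the exact form of the AS is not spelled out on the slide.

```python
from collections import Counter

def cp_scores(observed_cps):
    """CP-score: relative frequency of each CP in the baseline-aligned corpus."""
    counts = Counter(observed_cps)
    total = sum(counts.values())
    return {cp: c / total for cp, c in counts.items()}

def alignment_score(candidate_cps, scores, unseen=1e-6):
    """Combine the CP-scores of one candidate alignment (product assumed)."""
    score = 1.0
    for cp in candidate_cps:
        score *= scores.get(cp, unseen)
    return score

observed = [(1, 2, 1, 1), (0, 2, 0, 1), (0, 1, 0, 1), (0, 1, 0, 1)]
scores = cp_scores(observed)                 # e.g. (0, 1, 0, 1) -> 0.5
candidates = {
    "A1": [(0, 1, 0, 1), (1, 2, 1, 1)],
    "A2": [(0, 1, 0, 1), (9, 9, 9, 9)],      # contains an unseen CP
}
ranked = {a: alignment_score(cps, scores) for a, cps in candidates.items()}
print(max(ranked, key=ranked.get))           # disambiguation: highest AS wins
```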

  23. Alignment Disambiguation by AS • Adopt the alignment with the highest AS

  24. Outline • Introduction of Machine Translation • What is Alignment? • Statistical Machine Translation (SMT) • Example-based Machine Translation (EBMT) • Baseline alignment method • A probabilistic framework for alignment • Corresponding Pattern score (CP-score) • Integration of Maximum Entropy (ME) • Experiments and results • Discussion and conclusion

  25. Maximum Entropy (ME) • The principle of maximum entropy: a method for analyzing the available information in order to determine a unique epistemic probability distribution (from Wikipedia)

  26. Maximum Entropy (ME) • The principle of maximum entropy: a method for analyzing the available information in order to determine a unique epistemic probability distribution (from Wikipedia) • Alignment probability with ME [Och et al., 02]: P(A | S, T) = exp(Σ_i λ_i h_i(A, S, T)) / Σ_{A′} exp(Σ_i λ_i h_i(A′, S, T)) • S: source sentence, T: target sentence, A: alignment, h_i: feature function, λ_i: model parameter

  27. Feature Functions • Alignment Score (AS) • Parse score (Jap. and Eng.) • Depth pattern score (DP-score) • Probability of lexicon (Jap. and Eng.) • Coverage of the correspondences (Jap. and Eng.) • Average size of the correspondences (Jap. and Eng.)
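
A hedged sketch of how the log-linear (ME) model from slide 26 combines feature functions like the ones listed above into an alignment probability; the feature names, values, and weights here are placeholders, and in the actual work the weights λ_i are trained by a maximum-entropy learner rather than set by hand.

```python
import math

def log_linear_prob(candidates, weights):
    """P(A | S, T) proportional to exp(sum_i lambda_i * h_i(A, S, T)).
    candidates: {alignment_id: {feature_name: value}}."""
    unnorm = {
        a: math.exp(sum(weights[f] * v for f, v in feats.items()))
        for a, feats in candidates.items()
    }
    z = sum(unnorm.values())  # normalize over all candidate alignments
    return {a: u / z for a, u in unnorm.items()}

# Placeholder features (e.g. alignment score, parse score, coverage) and weights.
weights = {"alignment_score": 1.5, "parse_score": 0.5, "coverage": 2.0}
candidates = {
    "A1": {"alignment_score": -2.0, "parse_score": -1.0, "coverage": 0.9},
    "A2": {"alignment_score": -3.0, "parse_score": -0.5, "coverage": 0.7},
}
probs = log_linear_prob(candidates, weights)
print(max(probs, key=probs.get), probs)
```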

  28. Outline • Introduction of Machine Translation • What is Alignment? • Statistical Machine Translation (SMT) • Example-based Machine Translation (EBMT) • Baseline alignment method • A probabilistic framework for alignment • Corresponding Pattern score (CP-score) • Integration of Maximum Entropy (ME) • Experiments and results • Discussion and conclusion

  29. Experiments • Selected 500 moderately long sentences from the BTEC corpus of the IWSLT 2005 training data set • Manually annotated phrase-to-phrase alignments • Conducted 5-fold cross-validation (400 sentences for training and 100 for testing) • Calculated the F-measure: F = 2PR / (P + R), where P is precision and R is recall
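
A small sketch of the evaluation: with both the system output and the manual annotation represented as sets of alignment links, precision, recall, and the F-measure follow directly. The link representation below is an assumption.

```python
def precision_recall_f(predicted_links, gold_links):
    """predicted_links, gold_links: sets of hashable alignment links."""
    correct = len(predicted_links & gold_links)
    p = correct / len(predicted_links) if predicted_links else 0.0
    r = correct / len(gold_links) if gold_links else 0.0
    f = 2 * p * r / (p + r) if p + r > 0 else 0.0
    return p, r, f

# Toy example: links as (japanese_phrase_id, english_phrase_id) pairs.
gold = {(0, 2), (1, 0), (2, 1)}
pred = {(0, 2), (1, 0), (2, 3)}
print(precision_recall_f(pred, gold))  # precision = recall = F ≈ 0.667
```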

  30. Results

  31. Discussion • Clauses are not considered • Correspondences within the same clause of the source sentence are likely to be within the same clause of the target sentence • Sentence complexity • The proposed method works effectively for long and complex sentences • Preciseness of the dictionary • Erroneous correspondences from the dictionary have a negative effect on alignment

  32. Conclusion • Proposed a probabilistic framework to improve structure-based alignment • Proposed a new criterion, the CP-score, for evaluating alignments • Integrated the ME model into the alignment approach

  33. Future Work • Refine the CP and the CP-score • Consider clauses • Select the feature functions • Test our method on other corpora • Longer and more complex sentences
