Presentation Transcript


  1. Discriminative Latent Variable Based Classifier for Translation Error Detection
NLP&CC, Chongqing, Nov. 17-19, 2013
Presenter: Jinhua Du (杜金华), Xi’an University of Technology (西安理工大学)

  2. Outline
• 1. Introduction
• 2. DPLVM for Translation Error Detection
• 3. Experiments and Analysis
• 4. Conclusions and Future Work

  3. 1. Introduction: Problem
• In the localization industry, humans are always involved in post-editing MT results;
• MT errors increase the human cost of obtaining a reasonable translation;
• Translation error detection, or word confidence estimation, can improve the working efficiency of post-editors to some extent.
Research question: how can the accuracy of detecting translation errors be improved?

  4. 1. Introduction: Related Work
• Blatz et al. (2004) combined a neural network and a naive Bayes classifier;
• Ueffing and Ney (2003/2007) exhaustively explored various kinds of WPP features;
• Specia et al. (2009/2011) worked on confidence estimation in the CAT field;
• Xiong et al. (2010) used a MaxEnt-based classifier to predict translation errors.

  5. 1. Introduction: Key Factors
• Classifiers: for the same feature set, different classifiers show different performance, so how to select or design a proper classifier is important.
• Features: for a given classifier, different features reflect different characteristics of the problem, so how to select or design a feature set is crucial.

  6. 1. Introduction: Our Work
• A discriminative latent variable (DPLVM) classifier
• Comparison with SVM and MaxEnt
• Feature set

  7. 2. DPLVM Algorithm
• Conditions:
  • a sequence of observations x = {x1, x2, …, xm}
  • a sequence of labels y = {y1, y2, …, ym}
• Assumption:
  • a sequence of latent variables h = {h1, h2, …, hm}
• Goal:
  • to learn a mapping between x and y
• Definition (following Sun and Tsujii, 2009):
  P(y|x, Θ) = Σh P(y|h, x, Θ) P(h|x, Θ)   (1)
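The marginalisation in equation (1) is the core of the DPLVM classifier. The minimal sketch below, with made-up toy numbers (not taken from the paper), shows the quantity being computed for one fixed label sequence y and a handful of candidate latent assignments h.

```python
# Minimal sketch (not the paper's implementation): the DPLVM quantity
# P(y|x) = sum_h P(y|h, x) * P(h|x) on a toy example with 4 latent assignments.
import numpy as np

p_h_given_x = np.array([0.4, 0.3, 0.2, 0.1])    # P(h | x) for 4 toy assignments h
p_y_given_hx = np.array([0.9, 0.8, 0.1, 0.2])   # P(y | h, x) for one fixed y

# Equation (1): marginalise the latent variables out.
p_y_given_x = np.sum(p_y_given_hx * p_h_given_x)
print(p_y_given_x)  # 0.64
```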

  8. Simplified Algorithm
• Assumptions:
  • the model is restricted to have disjoint sets of latent variables associated with each class label;
  • each hj is a member of a set Hyj of possible latent variables for the class label yj;
  • so sequences that contain any hj ∉ Hyj have, by definition, P(y|x, Θ) = 0.
• Equation (1) can then be re-written as:
  P(y|x, Θ) = Σh∈Hy1×…×Hym P(h|x, Θ)   (2)
• where P(h|x, Θ) is defined by a log-linear model over latent sequences:
  P(h|x, Θ) = exp(Θ·f(h, x)) / Σh′ exp(Θ·f(h′, x))   (3)
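To make equations (2) and (3) concrete, here is a minimal sketch under stated assumptions: the emission and transition potentials are toy random numbers standing in for the exp(Θ·f) factors of equation (3), and the latent sets per label are invented. With disjoint latent sets, P(y|x) reduces to a sum of P(h|x) over label-compatible latent paths, which a forward pass computes.

```python
import numpy as np

H = {"c": [0, 1], "i": [2, 3]}   # disjoint latent states per label (illustrative)
n_states = 4

def seq_score(y, emit, trans):
    """Unnormalised P(y|x): sum over all h with h_j in H[y_j], via a forward pass."""
    alpha = np.zeros(n_states)
    alpha[H[y[0]]] = emit[0, H[y[0]]]
    for j in range(1, len(y)):
        new = np.zeros(n_states)
        for s in H[y[j]]:                       # only label-compatible latent states
            new[s] = emit[j, s] * np.dot(alpha, trans[:, s])
        alpha = new
    return alpha.sum()

rng = np.random.default_rng(0)
emit = rng.random((3, n_states))                # per-position potentials, toy values
trans = rng.random((n_states, n_states))        # latent-state transition potentials

# Normalise over all label sequences of length 3 to get P(y|x).
all_y = [[a, b, c] for a in "ci" for b in "ci" for c in "ci"]
Z = sum(seq_score(y, emit, trans) for y in all_y)
print(seq_score(["c", "c", "i"], emit, trans) / Z)
```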

  9. Parameter Estimation
• Parameter estimation: Θ* is obtained by maximising the regularised conditional log-likelihood of the training data;
• Decoding for the test set: y* = argmaxy P(y|x, Θ*);
• Decoding algorithm:
  • Sun and Tsujii (2009): a latent-dynamic inference (LDI) method based on A* search and dynamic programming.
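The brute-force sketch below is illustrative only: `score` is a stand-in for the DPLVM marginal P(y|x, Θ*), and the toy hypothesis is invented. It shows the decoding objective itself; the LDI method cited above reaches the same argmax efficiently with A* search and dynamic programming instead of enumeration.

```python
from itertools import product

def decode(words, score, labels=("c", "i")):
    """Brute-force y* = argmax_y P(y|x): enumerate all label sequences, keep the best."""
    return max(product(labels, repeat=len(words)), key=lambda y: score(words, y))

# Toy stand-in for P(y|x): reward labelling the misspelled word as incorrect.
toy = ["the", "transaltion", "is", "good"]
score = lambda x, y: sum((w == "transaltion") == (t == "i") for w, t in zip(x, y))
print(decode(toy, score))   # ('c', 'i', 'c', 'c')
```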

  10. DPLVM in the Translation Error Detection Task
• Prerequisites:
  • types of errors can be classified;
  • each class has a specific label;
  • the classification task can be regarded as a labelling task.
• Two classes of word label:
  • C: correct, good words are labelled c;
  • I: incorrect, bad words are labelled i.
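Purely for illustration (the words and labels below are invented, not drawn from the annotated data), casting error detection as sequence labelling means each hypothesis word receives one of the two labels:

```python
# Each MT hypothesis word is paired with a label: "c" for correct, "i" for incorrect.
hypothesis = ["the", "two", "countries", "signed", "a", "agreement"]
labels     = ["c",   "c",   "c",         "c",      "i", "c"]   # "a" before a vowel marked incorrect

print(list(zip(hypothesis, labels)))
# [('the', 'c'), ('two', 'c'), ('countries', 'c'), ('signed', 'c'), ('a', 'i'), ('agreement', 'c')]
```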

  11. Feature Set
• Word posterior probabilities (WPP):
  • fixed position based WPP;
  • flexible position based WPP;
  • word alignment based WPP.
• Lexical features:
  • part of speech (POS);
  • word entity.
• Syntactic features:
  • word links from the LG parser.
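As a rough sketch of how these cues could be attached to each hypothesis word; the function name, the coarse binning of the real-valued WPP scores, and the binary LG-link flag are assumptions for illustration, not the paper's exact representation.

```python
def word_features(word, pos, wpp_fixed, wpp_flex, wpp_align, has_lg_link):
    """Bundle the three WPP scores with the lexical and syntactic cues for one word."""
    return {
        "word": word,                      # lexical: the word itself
        "pos": pos,                        # lexical: part-of-speech tag
        "wpp_fixed": round(wpp_fixed, 1),  # WPP, fixed position based (coarsely binned)
        "wpp_flex": round(wpp_flex, 1),    # WPP, flexible position based
        "wpp_align": round(wpp_align, 1),  # WPP, word alignment based
        "lg_linked": has_lg_link,          # syntactic: word is linked in the LG parse
    }

print(word_features("agreement", "NN", 0.72, 0.68, 0.81, True))
```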

  12. Feature Representation

  13. 3. Experiments and Analysis: Experimental Settings – SMT System
• Language pair: Chinese-English
• Training set: NIST data set, 3.4M
• Devset: NIST MT 2006 current set
• Testset: NIST MT 2005 and 2008 sets
• SMT performance

  14. Experimental Settings for the Error Detection Task
• Data set and data annotation:
  • Devset: translations of NIST MT-08;
  • Testset: translations of NIST MT-05;
  • Annotation: TER is used to determine the true labels for words; the ratio of correct words (RCW) is 37.99% for MT-08 and 41.59% for MT-05.
• Evaluation metrics
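A minimal sketch of the scoring side, assuming CER is the classification error rate (the share of words whose predicted label disagrees with the TER-derived true label) and RCW is the ratio of correct words; the label sequences below are toy values, not the MT-05/MT-08 annotations.

```python
# CER = (# wrongly labelled words) / (# words); RCW = (# correct words) / (# words).
def cer(true_labels, predicted_labels):
    wrong = sum(t != p for t, p in zip(true_labels, predicted_labels))
    return wrong / len(true_labels)

true = ["c", "i", "c", "c", "i", "c"]
pred = ["c", "i", "i", "c", "c", "c"]
print(f"RCW = {true.count('c') / len(true):.2%}")   # 66.67%
print(f"CER = {cer(true, pred):.2%}")               # 2 of 6 wrong -> 33.33%
```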

  15. Comparison: (1) Classification Experiments Based on Individual Features

  16. (2) Classification Experiment on Combined Features

  17. Observations
• Named entities are prone to being wrongly classified;
• Prepositions, conjunctions, auxiliary verbs and articles are more likely to be wrongly classified;
• The proportion of notional words that are wrongly classified is relatively small.

  18. 4. Conclusions and Future Work: Conclusions
• Presents a new classifier, a DPLVM-based classifier, for translation error detection;
• Introduces three different kinds of WPP features and three linguistic features;
• Compares the MaxEnt classifier, the SVM classifier and our DPLVM classifier;
• The proposed classifier performs best in terms of CER compared with the two other individual classifiers.

  19. 4. Conclusions and Future Work: Future Work
• Introducing paraphrases to annotate the hypotheses;
• Introducing new useful features to further improve the detection capability;
• Performing experiments on more language pairs to verify our proposed method.

  20. Thanks for your attention! jhdu@xaut.edu.cn
