
Wrapper Syntax for Example-Based Machine Translation







  1. Wrapper Syntax for Example-Based Machine Translation
  Karolina Owczarzak, Bart Mellebeek, Declan Groves, Josef van Genabith, Andy Way
  National Centre for Language Technology, School of Computing, Dublin City University

  2. Overview • TransBooster – wrapper technology for MT • motivation • decomposition process • variables and template contexts • recomposition • Example-Based Machine Translation • marker-based EBMT • Experiment • English-Spanish • Europarl, Wall Street Journal section of Penn II Treebank • automatic and manual evaluation • Comparison with previous experiments

  3. TransBooster – wrapper technology for MT • Assumption: MT systems perform better at translating short sentences than long ones. • Decompose long sentences into shorter and syntactically simpler chunks, send to translation, recompose on output • Decomposition linguistically guided by syntactic parse of the sentence

  4. TransBooster – wrapper technology for MT • TransBooster technology is universal and can be applied to any MT system • Experiments to date: • TB and Rule-Based MT (Mellebeek et al., 2005a,b) • TB and Statistical MT (Mellebeek et al., 2006a) • TB and Multi-Engine MT (Mellebeek et al., 2006b) • TransBooster outperforms baseline MT systems

  5. TransBooster – decomposition • Input – syntactically parsed sentence (Penn II format) • Decompose into pivot and satellites • pivot: usually main predicate (plus additional material) • satellites: arguments and adjuncts • Recursively decompose satellites if longer than x leaves • Replace satellites around pivot with variables • static: simple same-type phrases with known translation • dynamic: simplified version of original satellites • send off to translation • Insert each satellite into a template context • static: simple predicate with known translation • dynamic: simpler version of original clause (pivot + simplified arguments, no adjuncts) • send off to translation
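As a rough illustration only (this is not the authors' implementation), the decomposition step can be sketched as a function that swaps long satellites for short same-type variables with known translations; the variable strings, the subject-pivot-object ordering, and the length threshold below are all invented for the example:

```python
def decompose(pivot, satellites, variables=("the man", "cars"), max_leaves=3):
    """Replace each satellite longer than max_leaves words with a short
    same-type static variable whose translation is known, and collect the
    long satellites for separate translation inside template contexts."""
    skeleton, to_translate = [], []
    for sat, var in zip(satellites, variables):
        if len(sat.split()) > max_leaves:
            skeleton.append(var)        # static variable: known translation
            to_translate.append(sat)    # sent off inside a template context
        else:
            skeleton.append(sat)        # short enough to keep in place
    # assume a simple subject-pivot-object order for this sketch
    return " ".join([skeleton[0], pivot] + skeleton[1:]), to_translate

skeleton, sats = decompose(
    "likes",
    ["The chairman, a long-time rival of Bill Gates,",
     "fast and confidential deals"],
)
# skeleton == "the man likes cars"; both satellites go off for separate translation
```

The skeleton is short and syntactically simple, which is exactly the property the approach relies on: the MT engine translates it more reliably than the original long sentence.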

  6. TransBooster – decomposition example
  (S (NP (NP (DT the) (NN chairman)) (, ,) (NP (NP (DT a) (JJ long-time) (NN rival)) (PP (IN of) (NP (NNP Bill) (NNP Gates)))) (, ,)) (VP (VBZ likes) (NP (ADJP (JJ fast) (CC and) (JJ confidential)) (NNS deals))) (. .))
  [The chairman, a long-time rival of Bill Gates,]ARG1 [likes]pivot [fast and confidential deals]ARG2.
  Sent to the MT engine:
  [The man]V1 [likes]pivot [cars]V2.
  [The chairman, a long-time rival of Bill Gates,]ARG1 [is sleeping]V1.
  [The man sees]V1 [fast and confidential deals]ARG2.
  [The chairman]V1 [likes]pivot [deals]V2.
  [The chairman, a long-time rival of Bill Gates,]ARG1 [likes deals]V1.
  [The chairman likes]V1 [fast and confidential deals]ARG2.

  7. TransBooster – recomposition • MT output: a set of translations with dynamic and static variables and contexts for a sentence S • Remove translations of dynamic variables and contexts from translation of S • If unsuccessful, back off to translation with static variables and contexts, remove those • Recombine translated pivot and satellites into output sentence

  8. TransBooster – recomposition example
  The chairman, a long-time rival of Bill Gates, likes fast and confidential deals.
  [The chairman]V1 [likes]pivot [deals]V2. -> El presidente tiene gusto de repartos.
  [The chairman, a long-time rival of Bill Gates,]ARG1 [likes deals]V1. -> El presidente, un rival de largo plazo de Bill Gates, tiene gusto de repartos.
  [The chairman likes]V1 [fast and confidential deals]ARG2. -> El presidente tiene gusto de repartos rápidos y confidenciales.
  [The man]V1 [likes]pivot [cars]V2. -> El hombre tiene gusto de automóviles.
  [The chairman, a long-time rival of Bill Gates,]ARG1 [is sleeping]V1. -> El presidente, un rival de largo plazo de Bill Gates, está durmiendo.
  [The man sees]V1 [fast and confidential deals]ARG2. -> El hombre ve repartos rápidos y confidenciales.
  Recomposed: [El presidente, un rival de largo plazo de Bill Gates,] [tiene gusto de] [repartos rápidos y confidenciales].
  Original translation: El presidente, rival de largo plazo de Bill Gates, gustos ayuna y los repartos confidenciales.
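The subtract-and-substitute logic in the recomposition example above can be caricatured at the string level. This is a simplification I'm assuming for illustration: the real system works from parser output and backs off from dynamic to static variables and contexts, whereas here only the static path is shown:

```python
def extract(context_translation, known_pieces):
    """Isolate a satellite's translation by removing the known translation
    of the surrounding static context from the MT output."""
    out = context_translation
    for piece in known_pieces:
        out = out.replace(piece, "")
    return out.strip()

def recompose(skeleton_translation, variable_translations, satellite_translations):
    """Substitute each satellite's isolated translation for the known
    translation of its variable in the translated skeleton."""
    out = skeleton_translation
    for var_tr, sat_tr in zip(variable_translations, satellite_translations):
        out = out.replace(var_tr, sat_tr)
    return out

# Static back-off path from the slide's example:
arg1 = extract("El presidente, un rival de largo plazo de Bill Gates, está durmiendo.",
               ["está durmiendo."])
arg2 = extract("El hombre ve repartos rápidos y confidenciales.",
               ["El hombre ve", "."])
result = recompose("El hombre tiene gusto de automóviles.",
                   ["El hombre", "automóviles"], [arg1, arg2])
# result reproduces the recomposed sentence from the slide
```

Because the static variables and contexts have known translations, their contribution can be stripped out deterministically, leaving just the satellite translations to slot back into the skeleton.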

  9. EBMT – Overview • An aligned bilingual corpus • Input text is matched against this corpus • The best match is found and a translation is produced [diagram: input example EX is matched against the source side of the corpus; the aligned target FX is output]
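The lookup step can be caricatured as a fuzzy string match over a toy aligned corpus. The pairs below are invented for illustration, and a real EBMT system recombines chunk-level matches rather than returning a single whole example:

```python
from difflib import SequenceMatcher

# Toy aligned corpus: (English source, Spanish target) pairs, invented here.
CORPUS = [
    ("the chairman likes deals", "el presidente tiene gusto de repartos"),
    ("the man is sleeping", "el hombre está durmiendo"),
]

def translate(source):
    """Return the target side of the closest source-side example."""
    best = max(CORPUS, key=lambda pair: SequenceMatcher(None, source, pair[0]).ratio())
    return best[1]

translate("the chairman likes fast deals")
# closest source-side example is "the chairman likes deals"
```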

  10. EBMT – Marker-Based Chunking
  <DET> = {the, a, these, …}    <DET> = {le, la, l’, une, un, ces, …}
  <PREP> = {on, of, …}          <PREP> = {sur, d’, …}
  English phrase: on virtually all uses of asbestos
  French translation: sur virtuellement tous usages d’asbeste
  <PREP> on virtually <DET> all uses <PREP> of asbestos
  <PREP> sur virtuellement <DET> tous usages <PREP> d’ asbeste
  Marker chunks:
  <PREP> on virtually : sur virtuellement
  <DET> all uses : tous usages
  <PREP> of asbestos : d’asbeste
  Lexical chunks:
  <LEX> on : sur
  <LEX> virtually : virtuellement
  <LEX> all : tous
  <LEX> uses : usages
  <LEX> of : d’
  <LEX> asbestos : asbeste
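A minimal sketch of the chunking step, assuming toy marker lexicons (the full system uses much larger closed-class word lists and more marker categories):

```python
# Toy marker lexicons for the English side; invented subsets for this example.
MARKERS = {
    "PREP": {"on", "of"},
    "DET": {"the", "a", "these", "all"},
}

def marker_chunk(words):
    """Split a word sequence into chunks, each opened by a marker word."""
    chunks, current, tag = [], [], None
    for w in words:
        cat = next((c for c, lex in MARKERS.items() if w.lower() in lex), None)
        if cat is not None and current:
            chunks.append((tag, " ".join(current)))   # close the open chunk
            current, tag = [], cat                     # start a new one
        elif cat is not None:
            tag = cat                                  # first chunk of the phrase
        current.append(w)
    if current:
        chunks.append((tag, " ".join(current)))
    return chunks

marker_chunk("on virtually all uses of asbestos".split())
# -> [('PREP', 'on virtually'), ('DET', 'all uses'), ('PREP', 'of asbestos')]
```

Running the same procedure with French marker lexicons over the aligned translation yields the parallel chunks that are then paired to form the marker-chunk examples on the slide.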

  11. EBMT – System Overview

  12. Experiment • English -> Spanish • Two test sets: • Wall Street Journal section of Penn II Treebank 800 sentences • Europarl 800 sentences • “Out-of-domain” factor: • TransBooster developed on perfect Penn II trees • EBMT trained on 958K English-Spanish Europarl sentences

  13. Experiment – Results: Automatic evaluation

                   Europarl            Wall Street Journal
                   BLEU      NIST      BLEU      NIST
  EBMT             0.2111    5.9243    0.1098    4.9081
  TransBooster     0.2134    5.9342    0.1140    4.9321
  % of Baseline    101%      100.2%    103.8%    100.5%

  Results for EBMT vs TransBooster on the 741-sentence test set from Europarl and the 800-sentence test set from the Penn II Treebank.

  14. Experiment – Results: Manual evaluation
  • 100 randomly selected sentences from EP test set: source English sentence, EBMT translation, EBMT + TransBooster translation
  • 3 judges, native speakers of Spanish fluent in English
  • Accuracy and fluency: relative scale for comparing the two translations
  • Inter-judge agreement (Kappa): Fluency > 0.948, Accuracy > 0.926
  • Absolute quality gain when using TransBooster: Fluency 19.33% of sentences, Accuracy 15.67% of sentences

  15. Experiment – Results: TB improvements
  Example 1
  Source: women have decided that they wish to work, that they wish to make their work compatible with their family life.
  EBMT: empresarias hemos decidido su deseo de trabajar, su deseo de hacer su trabajo compatible con su vida familiar.
  TB: mujeres han decidido su deseo de trabajar, su deseo de hacer su trabajo compatible con su vida familiar.
  Example 2
  Source: if this global warming continues, then part of the territory of the eu member states will become sea or desert.
  EBMT: si esto continúa calentamiento global, tanto dentro del territorio de los estados miembros tendrán tornarse altamar o desértico
  TB: si esto calentamiento global perdurará, entonces parte del territorio de los estados miembros de la unión europea tendrán tornarse altamar o desértico

  16. Previous experiments

                       BLEU                          NIST
                       Baseline  TB      % of BL    Baseline  TB      % of BL
  TB vs. SMT: WSJ      0.3108    0.3163  101.7%     7.3428    7.3901  100.6%
  TB vs. RBMT: WSJ     0.1343    0.1379  102.7%     5.1432    5.1259  99.7%
  TB vs. EBMT: WSJ     0.1098    0.1140  103.8%     4.9081    4.9321  100.5%
  TB vs. SMT: EP       0.1986    0.2052  103.3%     5.8393    5.8766  100.6%
  TB vs. EBMT: EP      0.2111    0.2134  101%       5.9243    5.9342  100.2%

  All comparisons use 800-sentence test sets: Penn II Treebank (WSJ) and Europarl (EP).

  17. Previous experiments – Europarl (800-sentence test set)

                  Baseline  TransBooster  % of Baseline
  SMT (BLEU)      0.1986    0.2052        103.3%
  SMT (NIST)      5.8393    5.8766        100.6%
  EBMT (BLEU)     0.2111    0.2134        101%
  EBMT (NIST)     5.9243    5.9342        100.2%

  18. Previous experiments – Wall Street Journal (800-sentence test set from Penn II Treebank)

                          Baseline  TransBooster  % of Baseline
  SMT (BLEU)              0.3108    0.3163        101.7%
  SMT (NIST)              7.3428    7.3901        100.6%
  Rule-Based MT (BLEU)    0.1343    0.1379        102.7%
  Rule-Based MT (NIST)    5.1432    5.1259        99.7%
  EBMT (BLEU)             0.1098    0.1140        103.8%
  EBMT (NIST)             4.9081    4.9321        100.5%

  19. Summary • TransBooster is a universal technology to decompose and recompose MT text • Net improvement in translation quality against EBMT: fluency in 19.33% of sentences, accuracy in 15.67% of sentences • Successful experiments to date: rule-based MT, phrase-based SMT, multi-engine MT, EBMT • Journal article in preparation

  20. Thank You
