Phrase Extraction in PB-SMT
Ankit K Srivastava, NCLT/CNGL
May 6, 2009

Presentation transcript:

Slide 1 (title): Phrase Extraction in PB-SMT. Ankit K Srivastava, NCLT/CNGL, May 6, 2009

Slide 2: About
- Phrase-based statistical machine translation
- Methods for phrase extraction
- Phrase induction via percolated dependencies
- Experimental setup & evaluation results
- Other facts & figures
- Moses customization
- Ongoing & future work
- Endnote

Slide 3: PB-SMT Modeling (section divider; talk outline repeated)

Slide 4: PB-SMT
- Processes sequences of words (phrases) rather than single words
- Segment the input, translate the segments, reorder the output
- Components: translation model, language model, decoder
- $\hat{e} = \arg\max_{e} p(e \mid f) = \arg\max_{e} p(f \mid e)\, p(e)$
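For context, the phrase-based translation model expands the $p(f \mid e)$ term of this noisy-channel equation as in Koehn et al. ('03): the source is segmented into phrases, each phrase is translated, and a distortion term penalizes reordering (the notation below is the standard formulation, not taken from the slides):

  $p(\bar{f}_1^I \mid \bar{e}_1^I) = \prod_{i=1}^{I} \phi(\bar{f}_i \mid \bar{e}_i)\, d(a_i - b_{i-1})$

where $\phi$ is the phrase translation probability and $d$ penalizes the distance between the start position $a_i$ of the $i$-th source phrase and the end position $b_{i-1}$ of the previous one.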

Slide 5: Learning Phrase Translations (section divider)

Slide 6: Extraction I
- Input is sentence-aligned parallel corpora
- Most approaches use word alignments
- Extract (learn) phrase pairs
- Build a phrase translation table
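The entries of the resulting phrase table are conventionally scored by relative frequency over the extracted pairs (standard PB-SMT practice rather than anything specific to this talk):

  $\phi(\bar{f} \mid \bar{e}) = \dfrac{\mathrm{count}(\bar{e}, \bar{f})}{\sum_{\bar{f}'} \mathrm{count}(\bar{e}, \bar{f}')}$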

Slide 7: Extraction II
- Get word alignments in both directions (src2tgt, tgt2src)
- Symmetrize them with the grow-diag-final heuristic
- Extract all phrase pairs consistent with the word alignment
- Non-syntactic phrases :: STR [Koehn et al. '03]
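A minimal sketch of the consistency criterion behind this step, assuming the alignment is a set of (source, target) index links; real extractors (e.g. the one shipped with Moses) also extend phrases over unaligned boundary words, which is omitted here:

```python
# Extract phrase pairs consistent with a word alignment (criterion of
# Koehn et al. '03): no link may connect a word inside the phrase pair
# to a word outside it.

def extract_phrases(src, tgt, alignment, max_len=7):
    """src, tgt: token lists; alignment: set of (i, j) links (src pos, tgt pos)."""
    pairs = []
    for i1 in range(len(src)):
        for i2 in range(i1, min(i1 + max_len, len(src))):
            # Target positions linked to the source span [i1, i2]
            tps = [j for (i, j) in alignment if i1 <= i <= i2]
            if not tps:
                continue
            j1, j2 = min(tps), max(tps)
            # Consistency: no link from inside the target span to outside the source span
            if any(j1 <= j <= j2 and not (i1 <= i <= i2) for (i, j) in alignment):
                continue
            if j2 - j1 < max_len:
                pairs.append((src[i1:i2 + 1], tgt[j1:j2 + 1]))
    return pairs

# Toy usage with a hypothetical one-to-one alignment
src = "the board".split()
tgt = "le conseil".split()
print(extract_phrases(src, tgt, {(0, 0), (1, 1)}))
# [(['the'], ['le']), (['the', 'board'], ['le', 'conseil']), (['board'], ['conseil'])]
```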

Slide 8: Extraction III
- Sentence-aligned and word-aligned text
- Monolingual parsing of both SRC & TGT
- Align subtrees and extract string pairs
- Syntactic phrases

Slide 9: Extraction IV
- Parse using a constituency parser
- Phrases are syntactic constituents :: CON [Tinsley et al. '07]

(ROOT (S (NP (NNP Vinken)) (VP (MD will) (VP (VB join) (NP (DT the) (NN board)) (PP (IN as) (NP (DT a) (JJ nonexecutive) (NN director))) (NP (NNP Nov) (CD 29))))))
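To make the CON candidates concrete, here is a small sketch (my own illustration, not the talk's tooling) that reads a Penn-style bracketing and lists the surface yield of every constituent; on one side of the bitext, these yields are the syntactic phrase candidates:

```python
# Parse a bracketed tree into (label, children-or-word) tuples and
# enumerate the surface string of every constituent.
import re

def parse(tokens):
    """Parse one bracketed node from a token stream."""
    assert tokens.pop(0) == '('
    label = tokens.pop(0)
    if tokens[0] != '(':                      # preterminal: (POS word)
        word = tokens.pop(0)
        tokens.pop(0)                         # closing ')'
        return (label, word)
    children = []
    while tokens[0] == '(':
        children.append(parse(tokens))
    tokens.pop(0)                             # closing ')'
    return (label, children)

def yields(node, out):
    """Append (label, surface string) for every node; return the node's words."""
    label, body = node
    if isinstance(body, str):
        out.append((label, body))
        return [body]
    words = [w for child in body for w in yields(child, out)]
    out.append((label, ' '.join(words)))
    return words

tree = "(ROOT (S (NP (NNP Vinken)) (VP (MD will) (VP (VB join) (NP (DT the) (NN board))))))"
tokens = re.findall(r'\(|\)|[^\s()]+', tree)
spans = []
yields(parse(tokens), spans)
for label, phrase in spans:
    print(label, ':', phrase)    # e.g.  NP : the board
```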

Slide 10: Extraction V
- Parse using a dependency parser
- Phrases have head-dependent relationships :: DEP [Hearne et al. '08]

HEAD       DEPENDENT
join       Vinken
join       will
board      the
join       board
join       as
director   a
director   nonexecutive
as         director
29         Nov
join       29

Slide 11: Extraction VI
- Numerous other phrase extraction methods exist
- Estimate phrase translations directly [Marcu & Wong '02]
- Use heuristics other than grow-diag-final
- Use marker-based chunks [Groves & Way '05]
- All models considered herein are string-to-string translation models

Slide 12: Head Percolation and Phrase Extraction (section divider)

Slide 13: Percolation I
- It is straightforward to convert a constituency tree to an unlabeled dependency tree [Gaifman '65]
- Use head percolation tables to identify the head child in a constituency representation [Magerman '95]
- The dependency tree is obtained by recursively applying head-child and non-head-child heuristics [Xia & Palmer '01]

Slide 14: Percolation II
- Input constituent: (NP (DT the) (NN board))
- Percolation rule: NP right NN/NNP/CD/JJ (scan the children right-to-left; take the first whose label is in the set)
- Head-marked constituent: (NP-board (DT the) (NN board))
- Result: "the" is dependent on "board"
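A sketch of how such a table can be applied to the (label, children) trees produced by the parsing sketch above. The four rules are the ones shown on the next slide; the interpretation (scan in the given direction, take the first child whose label is in the set) is my reading of the rule notation, chosen because it reproduces the slide's output exactly:

```python
# Head percolation sketch: RULES maps a phrase label to a scan direction and
# the set of child labels that may head it (entries copied from the slides;
# a full Magerman-style table covers every category).
RULES = {
    'NP': ('right', {'NN', 'NNP', 'CD', 'JJ'}),
    'PP': ('left',  {'IN', 'PP'}),
    'S':  ('right', {'VP', 'S'}),
    'VP': ('left',  {'VB', 'VP'}),
}

def head_word(node):
    """Lexical head of a (label, children-or-word) node."""
    label, body = node
    if isinstance(body, str):                 # a preterminal heads itself
        return body
    direction, labels = RULES.get(label, ('left', set()))
    order = body if direction == 'left' else list(reversed(body))
    for child in order:
        if child[0] in labels:                # first matching child in scan order
            return head_word(child)
    return head_word(order[0])                # fallback: first child in scan order

def dependencies(node, deps):
    """Each non-head child's head word depends on the node's head word."""
    label, body = node
    if isinstance(body, str):
        return
    h = head_word(node)
    for child in body:
        c = head_word(child)
        if c != h:
            deps.append((h, c))
        dependencies(child, deps)

np = ('NP', [('DT', 'the'), ('NN', 'board')])
deps = []
dependencies(np, deps)
print(deps)   # [('board', 'the')]
```

Run over the full Vinken tree from slide 9 (parsed with the earlier sketch), this produces exactly the ten HEAD/DEPENDENT pairs shown in the OUTPUT table on the next slide.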

Slide 15: Percolation III

INPUT (constituency parse):
(ROOT (S (NP (NNP Vinken)) (VP (MD will) (VP (VB join) (NP (DT the) (NN board)) (PP (IN as) (NP (DT a) (JJ nonexecutive) (NN director))) (NP (NNP Nov) (CD 29))))))

Percolation rules:
NP  right  NN / NNP / CD / JJ
PP  left   IN / PP
S   right  VP / S
VP  left   VB / VP

OUTPUT (head-dependent pairs):
HEAD       DEPENDENT
join       Vinken
join       will
board      the
join       board
join       as
director   a
director   nonexecutive
as         director
29         Nov
join       29

Slide 16: Percolation IV
- cf. slide 8, Extraction III (syntactic phrases)
- Parse by applying head percolation tables to constituency-annotated trees
- Align trees, extract surface chunks
- Phrases have head-dependent relations :: PERC

Slide 17: Tools, Resources, and MT System Performance (section divider)

Slide 18: System setup I
- Corpora: JOC [Chiao et al. '06], EUROPARL [Koehn '05]
- Parsers: Berkeley Parser [Petrov et al. '06], Syntex Parser [Bourigault et al. '05], head percolation [Xia & Palmer '01]
- Alignment tools: GIZA++ [Och & Ney '03], phrase heuristics [Koehn et al. '03], tree aligner [Zhechev '09]
- Language modeling: SRILM Toolkit [Stolcke '02]
- Decoder: Moses [Koehn et al. '07]
- Evaluation scripts: BLEU [Papineni et al. '02], NIST [Doddington '02], METEOR [Banerjee & Lavie '05], WER, PER

Slide 19: System setup II
- All 4 "systems" are run with the same configuration (with MERT tuning) on 2 different datasets
- They differ only in their phrase tables (number of chunks)

Sentence counts:
CORPUS     TRAIN     DEV     TEST
JOC        7,723     400     599
EUROPARL   100,000   1,889   2,000

Phrase table sizes:
CORPUS     STR       CON    DEP    PERC
JOC        236K      79K    74K    72K
EUROPARL   2,145K    663K   583K   565K

Slide 20: System setup III

On JOC (7K) data:
SYSTEM  BLEU    NIST   METEOR  WER     PER
STR     31.29   6.31   63.91   61.09   47.34
CON     30.64   6.34   63.82   60.72   45.99
DEP     30.75   6.31   64.12   61.34   46.77
PERC    29.19   6.09   62.12   62.69   48.21

On EUROPARL (100K) data:
SYSTEM  BLEU    NIST   METEOR  WER     PER
STR     28.50   7.00   57.83   57.43   44.11
CON     25.64   6.55   55.26   60.77   46.82
DEP     25.24   6.59   54.65   60.73   46.51
PERC    25.87   6.59   55.63   60.76   46.48

Slide 21: Analyzing STR, CON, DEP, and PERC (section divider). Note: analysis w.r.t. Europarl data only.

Slide 22: Analysis I
- Number of common & unique phrase pairs
- Maybe we should combine the phrase tables...

PHRASE TYPES   COMMON TO BOTH   UNIQUE IN 1ST TYPE   UNIQUE IN 2ND TYPE
DEP & PERC     369K             213K                 195K
CON & PERC     492K             171K                 72K
STR & PERC     127K             2,018K               437K
CON & DEP      391K             271K                 191K
STR & DEP      128K             2,016K               454K
STR & CON      144K             2,000K               518K

Slide 23: Analysis II
- Concatenate phrase tables and re-estimate probabilities
- 15 different phrase table combinations of STR + CON + DEP + PERC: $\sum_{r=1}^{4} \binom{4}{r} = 15$

UNI   BI           TRI             QUAD
S     SC, SD, SP   SCD, SCP, SDP   SCDP
C     CD, CP       CDP             -
D     DP           -               -
P     -            -               -
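A toy sketch of this combination step, assuming each phrase table is represented as a multiset of extracted (source, target) pairs; the counts below are invented placeholders, and real tables carry several feature scores per pair rather than the single relative-frequency score re-estimated here:

```python
# Concatenate phrase tables for every non-empty subset of {S, C, D, P},
# then re-estimate phi(f|e) by relative frequency over the merged counts.
from collections import Counter
from itertools import combinations

tables = {
    'S': Counter({('the board', 'le conseil'): 3, ('board', 'conseil'): 5}),
    'C': Counter({('the board', 'le conseil'): 2}),
    'D': Counter({('board', 'conseil'): 1}),
    'P': Counter({('the board', 'le conseil'): 1}),
}

def merge_and_score(names):
    merged = Counter()
    for n in names:
        merged += tables[n]                    # concatenation = summed counts
    totals = Counter()
    for (e, f), c in merged.items():
        totals[e] += c
    # phi(f|e) = count(e, f) / sum_f' count(e, f')
    return {(e, f): c / totals[e] for (e, f), c in merged.items()}

for r in range(1, 5):
    for combo in combinations('SCDP', r):      # 4C1 + 4C2 + 4C3 + 4C4 = 15
        print(''.join(combo), merge_and_score(combo))
```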

Slide 24: Analysis III
- All 15 "systems" are run with the same configuration (with MERT tuning)
- They differ only in their phrase tables
- This is combining at the "translation model" level

Slide 25: Analysis IV. Performance on Europarl (results chart not reproduced in the transcript).

Slide 26: Analysis V

REF:  Does the commission intend to seek more transparency in this area?
S:    Will the commission ensure that more than transparency in this respect?
C:    The commission will the commission ensure greater transparency in this respect?
D:    The commission will the commission ensure greater transparency in this respect?
P:    Does the commission intend to ensure greater transparency in this regard?
SC:   Will the commission ensure that more transparent in this respect?
SD:   Will the commission ensure that more transparent in this respect?
SP:   Does the commission intend to take to ensure that more than openness in this regard?
CD:   The commission will the commission ensure greater transparency in this respect?
CP:   The commission will the commission ensure greater transparency in this respect?
DP:   The commission will the commission ensure greater transparency in this respect?
SCD:  Does the commission intend to take to ensure that more transparent commit?
SCP:  Does the commission intend to take in this regard to ensure greater transparency?
SDP:  Does the commission intend to take in this regard to ensure greater transparency?
CDP:  The commission will the commission ensure greater transparency in this respect?
SCDP: Does the commission intend to take to ensure that more transparent suspected?

Slide 27: Analysis VI
- Which phrases does the decoder use?
- Decoder trace on S+C+D+P
- Out of 11,748 phrases: S (5204); C (2441); D (2319); P (2368)

Slide 28: Analysis VII
- Automatic per-sentence evaluation using TER on a test set of 2,000 sentences [Snover et al. '06]: C (1120); P (331); D (301); S (248)
- Manual per-sentence evaluation on a random test set of 100 sentences using pairwise system comparison: P=C (27%); P>D (5%); SC>SCP (11%)

Slide 29: Analysis VIII
- Treat the different phrase table combinations as individual MT systems
- Perform system combination using the MBR-CN framework [Du et al. '09]
- This is combining at the "system" level

SYSTEM  BLEU    NIST   METEOR  WER     PER
STR     29.46   7.11   58.87   56.43   43.03
CON     28.93   6.79   57.34   58.54   44.83
DEP     28.38   6.81   56.59   58.61   44.74
PERC    29.27   6.82   57.72   58.37   44.53
MBR     29.52   6.85   57.84   58.13   44.40
CN      30.70   7.06   58.52   55.87   42.86
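A minimal sketch of the MBR half of such a framework: from the candidate outputs of several systems, pick the hypothesis with the highest expected similarity to the rest. Plain token-overlap F1 stands in here for the BLEU-based gain used in practice, and the example sentences are invented:

```python
# Sentence-level minimum Bayes-risk selection over system outputs,
# with uniform posterior weights assumed for simplicity.
def f1(a, b):
    a, b = a.split(), b.split()
    overlap = sum(min(a.count(w), b.count(w)) for w in set(a))
    return 2 * overlap / (len(a) + len(b)) if a and b else 0.0

def mbr_select(hypotheses):
    # Maximize expected similarity to the other hypotheses
    return max(hypotheses,
               key=lambda h: sum(f1(h, h2) for h2 in hypotheses if h2 is not h))

outputs = [
    "does the commission intend to ensure greater transparency",
    "the commission will ensure greater transparency",
    "does the commission intend to ensure transparency",
]
print(mbr_select(outputs))
```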

Slide 30: Analysis IX
- Using Moses baseline phrases (STR) is essential for coverage. SIZE matters!
- However, adding any system to STR increases the baseline score. Symbiotic!
- Hence, do not replace STR, but supplement it.

Slide 31: Analysis X
- CON seems to be the best combination with STR (S+C seems to be the best performing system)
- CON has the most chunks in common with PERC
- Does PERC harm a CON system? Needs more analysis (bias between CON & PERC)

Slide 32: Analysis XI
- DEP chunks are different from PERC chunks, despite being equivalent in syntactic representation
- DEP can be substituted by PERC
- Points to a difference between knowledge induced from dependency and constituency parses. A different aligner?

Slide 33: Analysis XII
- PERC is a unique knowledge source
- Is it just a simple case of parser combination? Sometimes, it helps.
- Needs more work on finding the connection with CON / DEP

Slide 34: Customizing Moses for Syntax-Supplemented Phrase Tables (section divider)

Slide 35: Moses customization
- Incorporating syntax (CON, DEP, PERC):
  - Reordering model
  - Phrase scoring (new features)
  - Decoder parameters
  - Log-linear combination of T-tables
- Good phrase translations may be lost by the decoder. How can we ensure they remain intact?
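For reference, the standard log-linear model underlying Moses (the Och & Ney formulation, not specific to this talk) is where multiple T-tables would enter: each syntactic phrase table contributes its own feature functions $h_m$ with MERT-tuned weights $\lambda_m$:

  $\hat{e} = \arg\max_{e} \sum_{m=1}^{M} \lambda_m\, h_m(e, f)$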

Slide 36: Work in Progress and Future Plans (section divider)

Slide 37: Ongoing & future work
- Scaling (data size, language pair, language direction)
- Bias between CON & PERC
- Combining phrase pairs
- Combining systems
- Classify performance into sentence types
- Improve quality of phrase pairs in PB-SMT

Slide 38: Endnote (section divider)

Slide 39: Endnote
- Explored 3 linguistically motivated phrase extraction methods against Moses phrases
- Improves over the baseline; the highest recorded gain is a 10% relative increase in BLEU on the 100K data
- Rather than pursuing ONE way, combine options
- Need more analysis of supplementing the phrase table with multiple syntactic T-tables

Slide 40: Thank You!

Slide 41: Phrase Extraction in PB-SMT (abstract)
Phrase-based Statistical Machine Translation (PB-SMT) models, the most widely researched paradigm in MT today, rely heavily on the quality of phrase pairs induced from large amounts of training data. There are numerous methods for extracting these phrase translations from parallel corpora. In this talk I will describe phrase pairs induced from percolated dependencies and contrast them with three pre-existing phrase extractions. I will also present the performance of the individual phrase tables and their combinations in a PB-SMT system. I will then conclude with ongoing experiments and future research directions.

Slide 42: Thanks! Andy Way, Patrik Lambert, John Tinsley, Sylwia Ozdowska, Ventsislav Zhechev, Sergio Penkale, Jinhua Du

