
1 Learning-based MT Approaches for Languages with Limited Resources
Alon Lavie, Language Technologies Institute, Carnegie Mellon University
Joint work with: Jaime Carbonell, Lori Levin, Kathrin Probst, Erik Peterson, Christian Monson, Ariadna Font-Llitjos, Alison Alvarez, Roberto Aranovich

2 Outline
- Rationale for learning-based MT
- Roadmap for learning-based MT
- Framework overview
- Elicitation
- Learning transfer rules
- Automatic rule refinement
- Learning morphology
- Example prototypes
- Implications for MT with vast parallel data
- Conclusions and future directions

3 Machine Translation: Where are we today?
- Age of Internet and Globalization – great demand for MT:
  – Multiple official languages of UN, EU, Canada, etc.
  – Documentation dissemination for large manufacturers (Microsoft, IBM, Caterpillar)
- Economic incentive is still primarily within a small number of language pairs
- Some fairly good commercial products in the market for these language pairs
  – Primarily a product of rule-based systems after many years of development
- Pervasive MT between most language pairs is still non-existent and not on the immediate horizon

4 Approaches to MT: the Vauquois MT Triangle
Direct, Transfer, and Interlingua approaches, connected by Analysis and Generation.
Example: "Mi chiamo Alon Lavie" -> "My name is Alon Lavie"
Interlingua: Give-information+personal-data (name=alon_lavie)
Transfer: [s [vp accusative_pronoun "chiamare" proper_name]] -> [s [np [possessive_pronoun "name"]] [vp "be" proper_name]]

5 Progression of MT
- Started with rule-based systems
  – Very large expert human effort to construct language-specific resources (grammars, lexicons)
  – High-quality MT extremely expensive -> only for a handful of language pairs
- Along came EBMT and then SMT…
  – Replaced human effort with extremely large volumes of parallel text data
  – Less expensive, but still only feasible for a small number of language pairs
  – We "traded" human labor for data
- Where does this take us in 5-10 years?
  – Large parallel corpora for maybe 25-50 language pairs
- What about all the other languages?
- Is all this data (with very shallow representation of language structure) really necessary?
- Can we build MT approaches that learn deeper levels of language structure and how they map from one language to another?

6 Why Machine Translation for Languages with Limited Resources?
- We are in the age of information explosion
  – The internet + web + Google -> anyone can get the information they want anytime…
- But what about the text in all those other languages?
  – How do they read all this English stuff?
  – How do we read all the stuff that they put online?
- MT for these languages would enable:
  – Better government access to native indigenous and minority communities
  – Better minority and native community participation in information-rich activities (health care, education, government) without giving up their languages
  – Civilian and military applications (disaster relief)
  – Language preservation

7 The Roadmap to Learning-based MT
- Automatic acquisition of necessary language resources and knowledge using machine learning methodologies:
  – Learning morphology (analysis/generation)
  – Rapid acquisition of broad-coverage word-to-word and phrase-to-phrase translation lexicons
  – Learning of syntactic structural mappings: tree-to-tree structure transformations [Knight et al], [Eisner], [Melamed] require parse trees for both languages; learning syntactic transfer rules requires resources (grammar, parses) for just one of the two languages
  – Automatic rule refinement and/or post-editing
- A framework for integrating the acquired MT resources into effective MT prototype systems
- Effective integration of acquired knowledge with statistical/distributional information

8 CMU's AVENUE Approach
- Elicitation: use bilingual native informants to produce a small high-quality word-aligned bilingual corpus of translated phrases and sentences
  – Building elicitation corpora from feature structures
  – Feature detection and navigation
- Transfer-rule learning: apply ML-based methods to automatically acquire syntactic transfer rules for translation between the two languages
  – Learn from major language to minor language
  – Translate from minor language to major language
- XFER + Decoder:
  – XFER engine produces a lattice of possible transferred structures at all levels
  – Decoder searches and selects the best-scoring combination
- Rule refinement: refine the acquired rules via a process of interaction with bilingual informants
- Morphology learning
- Word and phrase bilingual lexicon acquisition

9 AVENUE Architecture
[Diagram] Word-aligned elicited data feeds the Learning Module, which produces Transfer Rules, e.g.:
{PP,4894}
;;Score:0.0470
PP::PP [NP POSTP] -> [PREP NP]
((X2::Y1) (X1::Y2))
The Run-Time Transfer System uses the learned rules and a Translation Lexicon to build a lattice, which the Lattice Decoder searches using an English Language Model and Word-to-Word Translation Probabilities.

10 Outline
- Rationale for learning-based MT
- Roadmap for learning-based MT
- Framework overview
- Elicitation
- Learning transfer rules
- Automatic rule refinement
- Learning morphology
- Example prototypes
- Implications for MT with vast parallel data
- Conclusions and future directions

11 Data Elicitation for Languages with Limited Resources
Rationale:
- Large volumes of parallel text not available -> create a small maximally-diverse parallel corpus that directly supports the learning task
- Bilingual native informant(s) can translate and align a small pre-designed elicitation corpus, using an elicitation tool
- Elicitation corpus designed to be typologically and structurally comprehensive and compositional
- Transfer-rule engine and new learning approach support acquisition of generalized transfer rules from the data

12 Elicitation Tool: English-Chinese Example [screenshot]

13 Elicitation Tool: English-Chinese Example [screenshot]

14 Elicitation Tool: English-Hindi Example [screenshot]

15 Elicitation Tool: English-Arabic Example [screenshot]

16 Elicitation Tool: Spanish-Mapudungun Example [screenshot]

17 Designing Elicitation Corpora
- What do we want to elicit?
  – Diversity of linguistic phenomena and constructions
  – Syntactic structural diversity
- How do we construct an elicitation corpus?
  – Typological Elicitation Corpus based on elicitation and documentation work of field linguists (e.g. Comrie 1977, Bouquiaux 1992): initial corpus size ~1000 examples
  – Structural Elicitation Corpus based on a representative sample of English phrase structures: ~120 examples
- Organized compositionally: elicit simple structures first, then use them as building blocks
- Goal: minimize size, maximize linguistic coverage

18 Typological Elicitation Corpus
- Feature detection
  – Discover what features exist in the language and where/how they are marked. Example: does the language mark gender of nouns? How and where is it marked?
  – Method: compare translations of minimal pairs – sentences that differ in only ONE feature
- Elicit translations/alignments for detected features and their combinations
- Dynamic corpus navigation based on feature detection: no need to elicit for combinations involving non-existent features

19 Typological Elicitation Corpus
- Initial typological corpus of about 1000 sentences was manually constructed
- New construction methodology for building an elicitation corpus (see the sketch below) using:
  – A feature specification: lists the inventory of available features and their values
  – A definition of the set of desired feature structures: schemas define sets of desired combinations of features and values; a multiplier algorithm generates the comprehensive set of feature structures
  – A generation grammar and lexicon: an NLG generator generates NL sentences from the feature structures
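The multiplier step is essentially a constrained cartesian product over feature values. A minimal sketch, assuming a toy feature inventory and schema format (the names and structures here are illustrative, not the actual AVENUE corpus machinery):

```python
from itertools import product

# Hypothetical feature inventory: feature name -> possible values.
FEATURES = {
    "num": ["sg", "pl"],
    "person": ["1", "2", "3"],
    "tense": ["past", "pres", "fut"],
}

def multiply(schema):
    """Generate every feature structure licensed by a schema.
    A schema maps each feature it cares about to the subset of
    values to combine; the result is their cartesian product."""
    names = sorted(schema)
    for combo in product(*(schema[n] for n in names)):
        yield dict(zip(names, combo))

# Schema: all person/number combinations, past tense only.
schema = {"num": FEATURES["num"], "person": FEATURES["person"], "tense": ["past"]}
for fs in multiply(schema):
    print(fs)  # e.g. {'num': 'sg', 'person': '1', 'tense': 'past'}
```

Each generated feature structure would then be handed to the generation grammar and lexicon to produce an English sentence to elicit.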

20 Structural Elicitation Corpus
- Goal: create a compact, diverse sample corpus of syntactic phrase structures in English in order to elicit how these map into the elicited language
- Methodology (see the sketch below):
  – Extracted all CFG "rules" from the Brown section of the Penn Treebank (122K sentences)
  – Simplified the POS tag set
  – Constructed a frequency histogram of extracted rules
  – Pulled out the simplest phrases for the most frequent rules for NPs, PPs, ADJPs, ADVPs, SBARs and sentences
  – Some manual inspection and refinement
- Resulting corpus of about 120 phrases/sentences representing common structures
- See [Probst and Lavie, 2004]
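A minimal sketch of the extraction step using NLTK's bundled treebank sample (an assumption for illustration: the bundled sample is a WSJ fragment, whereas the slide used the Brown portion; the simplified tag set and the manual phrase selection are omitted):

```python
from collections import Counter
from nltk.corpus import treebank  # NLTK's bundled Penn Treebank fragment

counts = Counter()
for tree in treebank.parsed_sents():
    for prod in tree.productions():
        if prod.is_nonlexical():        # keep CFG rules, drop POS -> word
            counts[prod] += 1

# Frequency histogram of NP expansions; the simplest example phrase
# for each frequent rule would then be pulled out by hand.
for prod, n in counts.most_common(20):
    if str(prod.lhs()) == "NP":
        print(n, prod)
```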

21 Outline
- Rationale for learning-based MT
- Roadmap for learning-based MT
- Framework overview
- Elicitation
- Learning transfer rules
- Automatic rule refinement
- Learning morphology
- Example prototypes
- Implications for MT with vast parallel data
- Conclusions and future directions

22 Transfer Rule Formalism
Type information; part-of-speech/constituent information; alignments; x-side constraints; y-side constraints; xy-constraints, e.g. ((Y1 AGR) = (X1 AGR))
; SL: the old man, TL: ha-ish ha-zaqen
NP::NP [DET ADJ N] -> [DET N DET ADJ]
(
 (X1::Y1) (X1::Y3) (X2::Y4) (X3::Y2)
 ((X1 AGR) = *3-SING)
 ((X1 DEF) = *DEF)
 ((X3 AGR) = *3-SING)
 ((X3 COUNT) = +)
 ((Y1 DEF) = *DEF)
 ((Y3 DEF) = *DEF)
 ((Y2 AGR) = *3-SING)
 ((Y2 GENDER) = (Y4 GENDER))
)
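To make the mechanics concrete, here is a minimal sketch of how a rule like the one above could be applied: the x-side constraints gate applicability, and the alignments drive reordering and duplication of constituents. The triple representation and the `apply_rule` helper are illustrative assumptions, not the XFER engine's actual data structures:

```python
# Toy rule: NP::NP [DET ADJ N] -> [DET N DET ADJ], as on the slide.
RULE = {
    "x_side": ["DET", "ADJ", "N"],
    # Which X constituent fills each Y slot: X1->Y1, X3->Y2, X1->Y3, X2->Y4.
    "y_fill": [1, 3, 1, 2],
    "x_constraints": [(1, "AGR", "*3-SING"), (3, "AGR", "*3-SING"), (3, "COUNT", "+")],
}

def apply_rule(rule, xs, lexicon):
    """xs: list of (pos, word, feats) for X1..Xn; returns TL words or None."""
    if [pos for pos, _, _ in xs] != rule["x_side"]:
        return None
    for idx, feat, val in rule["x_constraints"]:   # x-side constraints gate applicability
        if xs[idx - 1][2].get(feat) != val:
            return None
    # Alignments drive reordering/duplication; words go through the lexicon.
    return [lexicon.get(xs[xi - 1][1], xs[xi - 1][1]) for xi in rule["y_fill"]]

lex = {"the": "ha", "old": "zaqen", "man": "ish"}
xs = [("DET", "the", {"AGR": "*3-SING"}), ("ADJ", "old", {}),
      ("N", "man", {"AGR": "*3-SING", "COUNT": "+"})]
print(apply_rule(RULE, xs, lex))  # ['ha', 'ish', 'ha', 'zaqen'] = ha-ish ha-zaqen
```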

23 Transfer Rule Formalism (II)
Value constraints vs. agreement constraints:
; SL: the old man, TL: ha-ish ha-zaqen
NP::NP [DET ADJ N] -> [DET N DET ADJ]
(
 (X1::Y1) (X1::Y3) (X2::Y4) (X3::Y2)
 ((X1 AGR) = *3-SING)
 ((X1 DEF) = *DEF)
 ((X3 AGR) = *3-SING)
 ((X3 COUNT) = +)
 ((Y1 DEF) = *DEF)
 ((Y3 DEF) = *DEF)
 ((Y2 AGR) = *3-SING)
 ((Y2 GENDER) = (Y4 GENDER))
)

24 Rule Learning - Overview
- Goal: acquire syntactic transfer rules
- Use available knowledge from the source side (grammatical structure)
- Three steps:
  1. Flat Seed Generation: first guesses at transfer rules; flat syntactic structure
  2. Compositionality Learning: use previously learned rules to learn hierarchical structure
  3. Constraint Learning: refine rules by learning appropriate feature constraints

25 Flat Seed Rule Generation
Learning example (NP):
Eng: the big apple
Heb: ha-tapuax ha-gadol
Generated seed rule:
NP::NP [ART ADJ N] -> [ART N ART ADJ]
((X1::Y1) (X1::Y3) (X2::Y4) (X3::Y2))

26 Flat Seed Rule Generation
- Create a "flat" transfer rule specific to the sentence pair, partially abstracted to POS
  – Words that are aligned word-to-word and have the same POS in both languages are generalized to their POS
  – Words that have complex alignments (or not the same POS) remain lexicalized
- One seed rule for each translation example
- No feature constraints associated with seed rules (but mark the example(s) from which each was learned)
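A minimal sketch of that recipe (the tuple representation and 1-based alignment pairs are assumptions for illustration):

```python
def flat_seed_rule(src, tgt, alignments):
    """src/tgt: lists of (word, POS); alignments: set of 1-based (i, j) pairs.
    A source word is generalized to its POS only if every target word it
    aligns to carries the same POS; otherwise it stays lexicalized."""
    x_side = [w for w, _ in src]
    y_side = [w for w, _ in tgt]
    for i, (word, pos) in enumerate(src, 1):
        partners = [j for (a, j) in alignments if a == i]
        if partners and all(tgt[j - 1][1] == pos for j in partners):
            x_side[i - 1] = pos
            for j in partners:
                y_side[j - 1] = pos
    return x_side, y_side, sorted(alignments)

src = [("the", "ART"), ("big", "ADJ"), ("apple", "N")]
tgt = [("ha", "ART"), ("tapuax", "N"), ("ha", "ART"), ("gadol", "ADJ")]
align = {(1, 1), (1, 3), (2, 4), (3, 2)}
print(flat_seed_rule(src, tgt, align))
# (['ART', 'ADJ', 'N'], ['ART', 'N', 'ART', 'ADJ'], [(1, 1), (1, 3), (2, 4), (3, 2)])
```

This reproduces the [ART ADJ N] -> [ART N ART ADJ] seed rule from the previous slide.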

27 Compositionality Learning
Initial flat rules:
S::S [ART ADJ N V ART N] -> [ART N ART ADJ V P ART N]
((X1::Y1) (X1::Y3) (X2::Y4) (X3::Y2) (X4::Y5) (X5::Y7) (X6::Y8))
NP::NP [ART ADJ N] -> [ART N ART ADJ]
((X1::Y1) (X1::Y3) (X2::Y4) (X3::Y2))
NP::NP [ART N] -> [ART N]
((X1::Y1) (X2::Y2))
Generated compositional rule:
S::S [NP V NP] -> [NP V P NP]
((X1::Y1) (X2::Y2) (X3::Y4))

28 Compositionality Learning
- Detection: traverse the c-structure of the English sentence, add compositional structure for translatable chunks
- Generalization: adjust constituent sequences and alignments (see the sketch below)
- Two implemented variants:
  – Safe Compositionality: there exists a transfer rule that correctly translates the sub-constituent
  – Maximal Compositionality: generalize the rule if supported by the alignments, even in the absence of an existing transfer rule for the sub-constituent
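A minimal sketch of the generalization step: collapsing an aligned source/target sub-span that a lower-level rule covers into a single constituent symbol, and remapping the alignments. Spans are 1-based and inclusive; the representation is an illustrative assumption:

```python
def compose(x_side, y_side, alignments, x_span, y_span, label="NP"):
    """Collapse aligned sub-spans covered by a known lower-level rule
    into one constituent; a toy version of the generalization step."""
    (xs, xe), (ys, ye) = x_span, y_span
    new_x = x_side[:xs - 1] + [label] + x_side[xe:]
    new_y = y_side[:ys - 1] + [label] + y_side[ye:]

    def remap(i, s, e):
        if i < s:  return i
        if i <= e: return s          # inside the span -> the new constituent
        return i - (e - s)           # after the span -> shift left

    new_align = {(remap(i, xs, xe), remap(j, ys, ye)) for (i, j) in alignments}
    return new_x, new_y, sorted(new_align)

# The flat S rule from the previous slide, collapsing the first NP on both sides:
x = ["ART", "ADJ", "N", "V", "ART", "N"]
y = ["ART", "N", "ART", "ADJ", "V", "P", "ART", "N"]
a = {(1, 1), (1, 3), (2, 4), (3, 2), (4, 5), (5, 7), (6, 8)}
print(compose(x, y, a, (1, 3), (1, 4)))
# (['NP', 'V', 'ART', 'N'], ['NP', 'V', 'P', 'ART', 'N'], [(1, 1), (2, 2), (3, 4), (4, 5)])
```

Collapsing the remaining [ART N] / [ART N] span the same way yields the S::S [NP V NP] -> [NP V P NP] rule shown on the previous slide.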

29 Constraint Learning
Input: rules and their example sets
S::S [NP V NP] -> [NP V P NP] {ex1,ex12,ex17,ex26}
((X1::Y1) (X2::Y2) (X3::Y4))
NP::NP [ART ADJ N] -> [ART N ART ADJ] {ex2,ex3,ex13}
((X1::Y1) (X1::Y3) (X2::Y4) (X3::Y2))
NP::NP [ART N] -> [ART N] {ex4,ex5,ex6,ex8,ex10,ex11}
((X1::Y1) (X2::Y2))
Output: rules with feature constraints
S::S [NP V NP] -> [NP V P NP]
((X1::Y1) (X2::Y2) (X3::Y4)
 ((X1 NUM) = (X2 NUM))
 ((Y1 NUM) = (Y2 NUM))
 ((X1 NUM) = (Y1 NUM)))

30 Constraint Learning
- Goal: add appropriate feature constraints to the acquired rules
- Methodology:
  – Preserve general structural transfer
  – Learn specific feature constraints from the example set
- Seed rules are grouped into clusters of similar transfer structure (type, constituent sequences, alignments)
- Each cluster forms a version space: a partially ordered hypothesis space with a specific and a general boundary
- The seed rules in a group form the specific boundary of a version space
- The general boundary is the (implicit) transfer rule with the same type, constituent sequences, and alignments, but no feature constraints

31 Constraint Learning: Generalization
The partial order of the version space:
- Definition: a transfer rule tr1 is strictly more general than another transfer rule tr2 if all f-structures that are satisfied by tr2 are also satisfied by tr1.
Generalize rules by merging them:
- Deletion of a constraint
- Raising two value constraints to an agreement constraint, e.g. ((x1 num) = *pl), ((x3 num) = *pl) -> ((x1 num) = (x3 num))
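A minimal sketch of the two merge operators, with constraints encoded as tuples (an illustrative encoding, not the system's formalism): `('val', path, value)` for a value constraint and `('agr', path1, path2)` for an agreement constraint:

```python
def delete_constraint(constraints, c):
    """Generalization operator 1: drop one constraint."""
    return [k for k in constraints if k != c]

def raise_to_agreement(constraints, c1, c2):
    """Generalization operator 2:
    ((x1 num) = *pl), ((x3 num) = *pl)  =>  ((x1 num) = (x3 num))"""
    (k1, p1, v1), (k2, p2, v2) = c1, c2
    assert k1 == k2 == "val" and v1 == v2 and p1[-1] == p2[-1]
    rest = [k for k in constraints if k not in (c1, c2)]
    return rest + [("agr", p1, p2)]

cs = [("val", ("x1", "num"), "*pl"), ("val", ("x3", "num"), "*pl")]
print(raise_to_agreement(cs, cs[0], cs[1]))
# [('agr', ('x1', 'num'), ('x3', 'num'))]
```

Both operators move a rule strictly upward (more general) in the version space, since every f-structure satisfied by the original constraints is still satisfied afterwards.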

32 Automated Rule Refinement
- Bilingual informants can identify translation errors and pinpoint where they occur
- A sophisticated trace of the translation path can identify likely sources for the error and do "blame assignment"
- Rule refinement operators can be developed to modify the underlying translation grammar (and lexicon) based on characteristics of the error source:
  – Add or delete feature constraints from a rule
  – Bifurcate a rule into two rules (general and specific)
  – Add or correct lexical entries
- See [Font-Llitjos, Carbonell & Lavie, 2005]

33 Outline
- Rationale for learning-based MT
- Roadmap for learning-based MT
- Framework overview
- Elicitation
- Learning transfer rules
- Automatic rule refinement
- Learning morphology
- Example prototypes
- Implications for MT with vast parallel data
- Conclusions and future directions

34 Morphology Learning
- Goal: unsupervised learning of morphemes and their function from raw monolingual data
  – Segmentation of words into morphemes
  – Identification of morphological paradigms (inflections and derivations)
  – Learning the association between morphemes and their function in the language
- Organize the raw data in the form of a network of paradigm candidate schemes (see the sketch below)
- Search the network for a collection of schemes that represent true morphology paradigms of the language
- Learn mappings between the schemes and features/functions using minimal pairs of elicited data
- Construct an analyzer based on the collection of schemes and the acquired function mappings
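A minimal sketch of how candidate schemes can be read off raw vocabulary: index candidate stems by candidate suffix, then a scheme pairs a c-suffix set with the stems that combine with all of its suffixes ("0" stands in for the null suffix Ø; this illustrates the idea, not the actual search):

```python
from collections import defaultdict

def stems_by_suffix(vocab):
    """Every stem/suffix split of every word; '0' marks the null suffix."""
    stems = defaultdict(set)
    for word in vocab:
        for cut in range(1, len(word) + 1):
            stems[word[cut:] or "0"].add(word[:cut])
    return stems

def scheme(stems, suffix_set):
    """C-stems that combine with ALL suffixes in the set to form words."""
    return set.intersection(*(stems[s] for s in suffix_set))

vocab = ["blame", "blamed", "blames", "roamed", "roaming",
         "roams", "solve", "solves", "solving"]
stems = stems_by_suffix(vocab)
print(scheme(stems, {"0", "s"}))   # {'blame', 'solve'}
print(scheme(stems, {"e", "es"}))  # {'blam', 'solv'}
```

The following slides show this network for the toy vocabulary and for real Spanish newswire data.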

35 [Scheme network figure]
Example vocabulary: blame, blamed, blames, roamed, roaming, roams, solve, solves, solving.
Candidate schemes shown include: Ø.s {blame, solve}; Ø.s.d {blame}; s {blame, roam, solve}; e.es {blam, solv}; me.mes {bla}.

36 [Scheme network figure: the full network of candidate schemes over the example vocabulary, from Ø {blame, blames, blamed, roams, roamed, roaming, solve, solves, solving} up through schemes such as Ø.s {blame, solve}, Ø.s.d {blame}, e.es.ed {blam}, and me.mes.med {bla}]

37 [Scheme lattice figure] Derived from a Spanish newswire corpus: 40,011 tokens, 6,975 types.
Nodes pair a c-suffix set with its c-stem count and sample c-stems, e.g. a.as.o.os (43: african, cas, jurídic, l, …); a (1237: huelg, ib, id, iglesi, …); o.os (268: human, implicad, indici, indocumentad, …); a.as.o.os.tro (1: cas).

38 [Same lattice, annotated] Each node lists its c-suffixes and c-stems; the node's count is its c-stem type count. Level 5 = 5 c-suffixes (e.g. a.as.o.os.tro, with the single c-stem cas).

39 [Same lattice] The sub-lattice below a.as.o.os is the adjective inflection class; the nodes built from the spurious c-suffix "tro" (tro, a.tro, a.as.o.os.tro) are not true paradigms.

40 Basic Search Procedure [lattice figure: moving upward through the lattice, the c-suffix count increases while the c-stem count decreases]

41 Outline
- Rationale for learning-based MT
- Roadmap for learning-based MT
- Framework overview
- Elicitation
- Learning transfer rules
- Automatic rule refinement
- Learning morphology
- Example prototypes
- Implications for MT with vast parallel data
- Conclusions and future directions

42 AVENUE Prototypes
- General XFER framework under development for the past three years
- Prototype systems so far:
  – German-to-English, Dutch-to-English
  – Chinese-to-English
  – Hindi-to-English
  – Hebrew-to-English
- In progress or planned:
  – Mapudungun-to-Spanish
  – Quechua-to-Spanish
  – Arabic-to-English
  – Native-Brazilian languages to Brazilian Portuguese

43 Challenges for Hebrew MT
- Paucity of existing language resources for Hebrew
  – No publicly available broad-coverage morphological analyzer
  – No publicly available bilingual lexicons or dictionaries
  – No POS-tagged corpus or parse tree-bank corpus for Hebrew
  – No large Hebrew/English parallel corpus
- Scenario well suited for the CMU transfer-based MT framework for languages with limited resources

44 Hebrew-to-English MT Prototype
- Initial prototype developed within a two-month intensive effort
- Accomplished:
  – Adapted an available morphological analyzer
  – Constructed a preliminary translation lexicon
  – Translated and aligned the Elicitation Corpus
  – Learned XFER rules
  – Developed a (small) manual XFER grammar as a point of comparison
  – System debugging and development
  – Evaluated performance on unseen test data using automatic evaluation metrics

45 Morphology Example
Input word: B$WRH
0     1     2     3     4
|--------B$WRH--------|
|-----B-----|$WR|--H--|
|--B--|-H--|--$WRH---|

46 Morphology Example
Y0: ((SPANSTART 0) (SPANEND 4) (LEX B$WRH) (POS N) (GEN F) (NUM S) (STATUS ABSOLUTE))
Y1: ((SPANSTART 0) (SPANEND 2) (LEX B) (POS PREP))
Y2: ((SPANSTART 1) (SPANEND 3) (LEX $WR) (POS N) (GEN M) (NUM S) (STATUS ABSOLUTE))
Y3: ((SPANSTART 3) (SPANEND 4) (LEX $LH) (POS POSS))
Y4: ((SPANSTART 0) (SPANEND 1) (LEX B) (POS PREP))
Y5: ((SPANSTART 1) (SPANEND 2) (LEX H) (POS DET))
Y6: ((SPANSTART 2) (SPANEND 4) (LEX $WRH) (POS N) (GEN F) (NUM S) (STATUS ABSOLUTE))
Y7: ((SPANSTART 0) (SPANEND 4) (LEX B$WRH) (POS LEX))
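A minimal sketch of how such a set of analyses can be enumerated, given a morpheme lexicon: recursively cover the word with known segments, recording spans (the toy lexicon is illustrative, and the character offsets here need not match the slide's span indexing):

```python
def analyses(word, lexicon):
    """All segmentations of `word` into lexicon entries, as
    (morpheme, spanstart, spanend) paths over character offsets."""
    n = len(word)
    def extend(pos, path):
        if pos == n:
            yield path
            return
        for end in range(pos + 1, n + 1):
            seg = word[pos:end]
            if seg in lexicon:                 # only known morphemes extend a path
                yield from extend(end, path + [(seg, pos, end)])
    return list(extend(0, []))

# Tiny illustrative lexicon for transliterated Hebrew B$WRH.
lexicon = {"B", "H", "$WR", "$WRH", "B$WRH"}
for path in analyses("B$WRH", lexicon):
    print(path)
# [('B', 0, 1), ('$WR', 1, 4), ('H', 4, 5)]
# [('B', 0, 1), ('$WRH', 1, 5)]
# [('B$WRH', 0, 5)]
```

Each analysis would then be decorated with features (POS, GEN, NUM, …) from the lexicon, as in the Y0–Y7 structures above.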

47 Sample Output (dev-data)
maxwell anurpung comes from ghana for israel four years ago and since worked in cleaning in hotels in eilat a few weeks ago announced if management club hotel that for him to leave israel according to the government instructions and immigration police in a letter in broken english which spread among the foreign workers thanks to them hotel for their hard work and announced that will purchase for hm flight tickets for their countries from their money

48 Evaluation Results
Test set of 62 sentences from the Haaretz newspaper, 2 reference translations.
System   | BLEU   | NIST   | P      | R      | METEOR
No Gram  | 0.0616 | 3.4109 | 0.4090 | 0.4427 | 0.3298
Learned  | 0.0774 | 3.5451 | 0.4189 | 0.4488 | 0.3478
Manual   | 0.1026 | 3.7789 | 0.4334 | 0.4474 | 0.3617

49 Outline
- Rationale for learning-based MT
- Roadmap for learning-based MT
- Framework overview
- Elicitation
- Learning transfer rules
- Automatic rule refinement
- Learning morphology
- Example prototypes
- Implications for MT with vast parallel data
- Conclusions and future directions

50 Implications for MT with Vast Amounts of Parallel Data
- Learning word/short-phrase translations vs. learning long phrase-to-phrase translations
- Phrase-to-phrase MT is ill-suited for long-range reorderings -> ungrammatical output
- Recent work on hierarchical Stat-MT [Chiang, 2005] and parsing-based MT [Melamed et al, 2005]
- Learning general tree-to-tree syntactic mappings is equally problematic:
  – Meaning is a hybrid of complex, non-compositional phrases embedded within a syntactic structure
  – Some constituents can be translated in isolation, others require contextual mappings

51 Implications for MT with Vast Amounts of Parallel Data
Our approach for learning transfer rules is applicable to the large-data scenario, subject to solutions for several challenges:
- No elicitation corpus -> break down parallel sentences into reasonable learning examples
- Working with less reliable automatic word alignments rather than manual alignments
- Effective use of reliable parse structures for ONE language (i.e. English) and automatic word alignments in order to decompose the translation of a sentence into several compositional rules
- Effective scoring of the resulting very large transfer grammars, and scaled-up transfer + decoding

52 Implications for MT with Vast Amounts of Parallel Data
Example:
他 经常 与 江泽民 总统 通 电话
He freq with J Zemin Pres via phone
He freq talked with President J Zemin over the phone

53 Implications for MT with Vast Amounts of Parallel Data
Same example, with the noun phrases NP1, NP2, NP3 marked:
他 经常 与 江泽民 总统 通 电话
He freq with J Zemin Pres via phone
He freq talked with President J Zemin over the phone

54 Conclusions
- There is hope yet for wide-spread MT between many of the world's language pairs
- MT offers a fertile yet extremely challenging ground for learning-based approaches that leverage diverse sources of information:
  – Syntactic structure of one or both languages
  – Word-to-word correspondences
  – Decomposable units of translation
  – Statistical language models
- Provides a feasible solution to MT for languages with limited resources
- Extremely promising approach for addressing the fundamental weaknesses in current corpus-based MT for languages with vast resources

55 Future Research Directions
- Automatic transfer rule learning:
  – In the "large-data" scenario: from large volumes of uncontrolled parallel text, automatically word-aligned
  – In the absence of morphology or POS-annotated lexica
  – Learning mappings for non-compositional structures
  – Effective models for rule scoring for: decoding (using scores at runtime); pruning the large collections of learned rules
  – Learning unification constraints
- Integrated XFER engine and decoder
  – Improved models for scoring tree-to-tree mappings, integration with LM and other knowledge sources in the course of the search

56 Future Research Directions
- Automatic rule refinement
- Morphology learning
- Feature detection and corpus navigation
- …


58 Mapudungun-to-Spanish Example
Mapudungun: pelafiñ Maria
Spanish: No vi a María
English: I didn't see Maria

59 Mapudungun-to-Spanish Example
Mapudungun: pelafiñ Maria
  pe-la-fi-ñ Maria
  see-neg-3.obj-1.subj.indicative Maria
Spanish: No vi a María
  neg see.1.subj.past.indicative acc Maria
English: I didn't see Maria

60 [Analysis figure, built bottom-up over pe-la-fi-ñ Maria] Verb root: V pe

61 Suffix la attaches as VSuff: negation = +

62 V + VSuff combine under VSuffG: pass all features up

63 Suffix fi attaches as VSuff: object person = 3

64 VSuffG + VSuff combine under VSuffG: pass all features up from both children

65 Suffix ñ attaches as VSuff: person = 1, number = sg, mood = ind

66 VSuffG + VSuff combine under VSuffG: pass all features up from both children

67 V + VSuffG combine under V: pass all features up from both children; check that 1) negation = + and 2) tense is undefined

68 Maria parses as NP over N: person = 3, number = sg, human = +

69 S over V and NP: pass features up from V; check that the NP is human = +

70 Transfer to Spanish: top-down, starting at S

71 The Spanish side expands with V, "a", and NP: pass all features to the Spanish side

72 Pass all features down

73 Pass object features down

74 The accusative marker "a" on objects is introduced because human = +

75 Transfer to Spanish: Top-Down
The VP rule that introduces the accusative marker:
VP::VP [VBar NP] -> [VBar "a" NP]
((X1::Y1) (X2::Y3)
 ((X2 type) = (*NOT* personal))
 ((X2 human) =c +)
 (X0 = X1)
 ((X0 object) = X2)
 (Y0 = X0)
 ((Y0 object) = (X0 object))
 (Y1 = Y0)
 (Y3 = (Y0 object))
 ((Y1 objmarker person) = (Y3 person))
 ((Y1 objmarker number) = (Y3 number))
 ((Y1 objmarker gender) = (Y3 gender)))

76 The Spanish verb cluster expands to "no" V: pass person, number, and mood features to the Spanish verb; assign tense = past

77 "no" is introduced because negation = +

78 Lexical transfer: pe -> ver

79 ver is inflected as vi: person = 1, number = sg, mood = indicative, tense = past

80 María: pass features over to the Spanish side

81 Final result: "No vi a María" – "I didn't see Maria"


83 Conclusions
- Transfer rules (both manual and learned) offer significant contributions that can complement existing data-driven approaches
  – Also in medium and large data settings?
- Initial steps toward development of a statistically grounded transfer-based MT system with:
  – Rules that are scored based on a well-founded probability model
  – Strong and effective decoding that incorporates the most advanced techniques used in SMT decoding
- Working from the "opposite" end of research on incorporating models of syntax into "standard" SMT systems [Knight et al]
- Our direction makes sense in the limited data scenario

84 Missing Science
- Monolingual learning tasks:
  – Learning morphology: morphemes and their meaning
  – Learning syntactic and semantic structures: grammar induction
- Bilingual learning tasks:
  – Automatic acquisition of word and phrase translation lexicons
  – Learning structural mappings (syntactic, semantic, non-compositional)
- Models that effectively combine learned symbolic knowledge with statistical information: new "decoders"

85 AVENUE Partners
Language              | Country       | Institutions
Mapudungun (in place) | Chile         | Universidad de la Frontera, Institute for Indigenous Studies, Ministry of Education
Quechua (discussion)  | Peru          | Ministry of Education
Aymara (discussion)   | Bolivia, Peru | Ministry of Education

86 The Transfer Engine
Analysis: source text is parsed into its grammatical structure; determines transfer application ordering. Example: 他 看 书。 (he read book) parses as S -> NP (N 他) VP (V 看, NP 书).
Transfer: a target-language tree is created by reordering, insertion, and deletion: S -> NP (N he) VP (V read, NP (DET a, N book)). The article "a" is inserted into the object NP; source words are translated with the transfer lexicon.
Generation: target-language constraints are checked and the final translation is produced, e.g. "reads" is chosen over "read" to agree with "he". Final translation: "He reads a book"

87 Seeded VSL: Some Open Issues
- Three types of constraints:
  – X-side: constrain applicability of the rule
  – Y-side: assist in generation
  – X-Y: transfer features from SL to TL
- Which of the three types improves translation performance?
  – Use rules without features to populate the lattice; the decoder will select the best translation…
  – Learn only X-Y constraints, based on a list of universally projecting features
- Other notions of version spaces of feature constraints:
  – Current feature learning is specific to rules that have identical transfer components
  – An important issue during transfer is to disambiguate among rules that have the same SL side but different TL sides – can we learn effective constraints for this?

88 Examples of Learned Rules (Hindi-to-English)
{NP,14244} ;;Score:0.0429
NP::NP [N] -> [DET N]
((X1::Y2))
{NP,14434} ;;Score:0.0040
NP::NP [ADJ CONJ ADJ N] -> [ADJ CONJ ADJ N]
((X1::Y1) (X2::Y2) (X3::Y3) (X4::Y4))
{PP,4894} ;;Score:0.0470
PP::PP [NP POSTP] -> [PREP NP]
((X2::Y1) (X1::Y2))

89 Transfer Engine [diagram]
Hebrew input: בשורה הבאה
Preprocessing and morphology feed the transfer engine, which uses transfer rules, e.g.:
{NP1,3}
NP1::NP1 [NP1 "H" ADJ] -> [ADJ NP1]
((X3::Y1) (X1::Y2)
 ((X1 def) = +)
 ((X1 status) =c absolute)
 ((X1 num) = (X3 num))
 ((X1 gen) = (X3 gen))
 (X0 = X1))
and a translation lexicon, e.g.:
N::N |: ["$WR"] -> ["BULL"]
((X1::Y1) ((X0 NUM) = s) ((Y0 lex) = "BULL"))
N::N |: ["$WRH"] -> ["LINE"]
((X1::Y1) ((X0 NUM) = s) ((Y0 lex) = "LINE"))
Translation output lattice:
(0 1 "IN" @PREP) (1 1 "THE" @DET) (2 2 "LINE" @N)
(1 2 "THE LINE" @NP) (0 2 "IN LINE" @PP) (0 4 "IN THE NEXT LINE" @PP)
The decoder selects the English output using an English language model: "in the next line"

90 Future Directions
- Continued work on automatic rule learning (especially Seeded Version Space Learning)
  – Use the Hebrew and Hindi systems as test platforms for experimenting with advanced learning research
- Rule refinement via interaction with bilingual speakers
- Developing a well-founded model for assigning scores (probabilities) to transfer rules
- Redesigning and improving the decoder to better fit the specific characteristics of the XFER model
- Improved leveraging from manual grammar resources
- MEMT with improved:
  – Combination of output from different translation engines with different confidence scores
  – Strong decoding capabilities

91 Seeded Version Space Learning
[Version-space figure over rule clusters such as NP and VP]
1. Group seed rules into version spaces as above.
2. Make use of the partial order of rules in the version space. The partial order is defined via the f-structures satisfying the constraints.
3. Generalize in the space by repeated merging of rules:
   – Deletion of a constraint
   – Moving value constraints to agreement constraints, e.g. ((x1 num) = *pl), ((x3 num) = *pl) -> ((x1 num) = (x3 num))
4. Check the translation power of generalized rules against the sentence pairs.

92 Seeded Version Space Learning: The Search
- The Seeded Version Space algorithm itself is the repeated generalization of rules by merging (see the sketch below)
- A merge is successful if the set of sentences that can correctly be translated with the merged rule is a superset of the union of the sets that can be translated with the unmerged rules, i.e. check the power of the rule
- Merge until no more successful merges
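A minimal sketch of that outer loop (greedy, and parameterized by `merge` and `translates_ok` callbacks that stand in for the real rule merger and the translation check; not the exact published algorithm):

```python
def svs_search(rules, merge, translates_ok):
    """rules: list of (rule, example_set) pairs.
    Repeatedly merge rule pairs; keep a merge only if the merged rule
    still translates every example of both parents (the 'power' check)."""
    merged_something = True
    while merged_something:
        merged_something = False
        for i in range(len(rules)):
            for j in range(i + 1, len(rules)):
                (r1, ex1), (r2, ex2) = rules[i], rules[j]
                candidate = merge(r1, r2)
                if candidate is None:
                    continue
                examples = ex1 | ex2
                if all(translates_ok(candidate, e) for e in examples):
                    rules[j] = (candidate, examples)  # merged rule replaces one parent
                    del rules[i]                      # ...and the other is dropped
                    merged_something = True
                    break
            if merged_something:
                break
    return rules
```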

93 AVENUE Architecture [diagram]
Learning Module: elicitation process (with the user) -> transfer rule learning -> transfer rules.
Run-Time Module: SL input -> morphology pre-processing -> SL parser -> transfer engine -> TL generator -> decoder -> TL output.

94 Learning Transfer-Rules for Languages with Limited Resources
Rationale:
- Large bilingual corpora not available
- Bilingual native informant(s) can translate and align a small pre-designed elicitation corpus, using an elicitation tool
- Elicitation corpus designed to be typologically comprehensive and compositional
- Transfer-rule engine and new learning approach support acquisition of generalized transfer rules from the data

95 The Transfer Engine
Analysis: source text is parsed into its grammatical structure; determines transfer application ordering. Example: 他 看 书。 (he read book) parses as S -> NP (N 他) VP (V 看, NP 书).
Transfer: a target-language tree is created by reordering, insertion, and deletion: S -> NP (N he) VP (V read, NP (DET a, N book)). The article "a" is inserted into the object NP; source words are translated with the transfer lexicon.
Generation: target-language constraints are checked and the final translation is produced, e.g. "reads" is chosen over "read" to agree with "he". Final translation: "He reads a book"

96 Transfer Rule Formalism
Type information; part-of-speech/constituent information; alignments; x-side constraints; y-side constraints; xy-constraints, e.g. ((Y1 AGR) = (X1 AGR))
; SL: the man, TL: der Mann
NP::NP [DET N] -> [DET N]
(
 (X1::Y1) (X2::Y2)
 ((X1 AGR) = *3-SING)
 ((X1 DEF) = *DEF)
 ((X2 AGR) = *3-SING)
 ((X2 COUNT) = +)
 ((Y1 AGR) = *3-SING)
 ((Y1 DEF) = *DEF)
 ((Y2 AGR) = *3-SING)
 ((Y2 GENDER) = (Y1 GENDER))
)

97 Transfer Rule Formalism (II)
Value constraints vs. agreement constraints:
; SL: the man, TL: der Mann
NP::NP [DET N] -> [DET N]
(
 (X1::Y1) (X2::Y2)
 ((X1 AGR) = *3-SING)
 ((X1 DEF) = *DEF)
 ((X2 AGR) = *3-SING)
 ((X2 COUNT) = +)
 ((Y1 AGR) = *3-SING)
 ((Y1 DEF) = *DEF)
 ((Y2 AGR) = *3-SING)
 ((Y2 GENDER) = (Y1 GENDER))
)

98 Rule Learning - Overview
- Goal: acquire syntactic transfer rules
- Use available knowledge from the source side (grammatical structure)
- Three steps:
  1. Flat Seed Generation: first guesses at transfer rules; flat syntactic structure
  2. Compositionality: use previously learned rules to add hierarchical structure
  3. Seeded Version Space Learning: refine rules by generalizing with validation (learn appropriate feature constraints)

99 Examples of Learned Rules (I)
{NP,14244} ;;Score:0.0429
NP::NP [N] -> [DET N]
((X1::Y2))
{NP,14434} ;;Score:0.0040
NP::NP [ADJ CONJ ADJ N] -> [ADJ CONJ ADJ N]
((X1::Y1) (X2::Y2) (X3::Y3) (X4::Y4))
{PP,4894} ;;Score:0.0470
PP::PP [NP POSTP] -> [PREP NP]
((X2::Y1) (X1::Y2))

100 A Limited Data Scenario for Hindi-to-English
Put together a scenario with "miserly" data resources:
- Elicited data corpus: 17,589 phrases
- Cleaned portion (top 12%) of the LDC dictionary: ~2,725 Hindi words (23,612 translation pairs)
- Manually acquired resources during the SLE:
  – 500 manual bigram translations
  – 72 manually written phrase transfer rules
  – 105 manually written postposition rules
  – 48 manually written time expression rules
- No additional parallel text!!

101 Manual Grammar Development
- Covers mostly NPs, PPs and VPs (verb complexes)
- ~70 grammar rules, covering basic and recursive NPs and PPs, and verb complexes of the main tenses in Hindi (developed in two weeks)

102 Manual Transfer Rules: Example
;; PASSIVE OF SIMPLE PAST (NO AUX) WITH LIGHT VERB
;; passive of 43 (7b)
{VP,28}
VP::VP : [V V V] -> [Aux V]
(
 (X1::Y2)
 ((x1 form) = root)
 ((x2 type) =c light)
 ((x2 form) = part)
 ((x2 aspect) = perf)
 ((x3 lexwx) = 'jAnA')
 ((x3 form) = part)
 ((x3 aspect) = perf)
 (x0 = x1)
 ((y1 lex) = be)
 ((y1 tense) = past)
 ((y1 agr num) = (x3 agr num))
 ((y1 agr pers) = (x3 agr pers))
 ((y2 form) = part)
)

103 Manual Transfer Rules: Example
; NP1 ke NP2 -> NP2 of NP1
; Ex: jIvana ke eka aXyAya
; life of (one) chapter
; ==> a chapter of life
{NP,12}
NP::NP : [PP NP1] -> [NP1 PP]
(
 (X1::Y2) (X2::Y1)
 ; ((x2 lexwx) = 'kA')
)
{NP,13}
NP::NP : [NP1] -> [NP1]
((X1::Y1))
{PP,12}
PP::PP : [NP Postp] -> [Prep NP]
((X1::Y2) (X2::Y1))
[Tree figure: "jIvana ke eka aXyAya" maps to "one chapter of life"]

104 Adding a "Strong" Decoder
- XFER system produces a full lattice
- Edges are scored using word-to-word translation probabilities, trained from the limited bilingual data
- Decoder uses an English LM (70m words)
- Decoder can also reorder words or phrases (up to 4 positions ahead)
- For XFER (strong), ONLY edges from the basic XFER system are used!
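A minimal sketch of the search over such a lattice, monotone only (the real decoder also permutes words/phrases up to 4 positions ahead); `lm` and `tm` are assumed log-probability callbacks standing in for the English LM and the trained word-to-word probabilities:

```python
import math
from collections import defaultdict

def decode(lattice, n, lm, tm):
    """Viterbi search over a translation lattice.
    lattice: (start, end, words) edges covering source span [start, end)
    n: number of source positions; lm(prev, w) and tm(edge) are log-probs."""
    best = defaultdict(lambda: (-math.inf, None))  # (pos, last_word) -> (score, back)
    best[(0, "<s>")] = (0.0, None)
    for pos in range(n):
        states = [(k, v) for k, v in best.items() if k[0] == pos]
        for (start, end, words) in lattice:
            if start != pos:
                continue
            for (p, last), (score, _) in states:
                s, prev = score + tm((start, end, words)), last
                for w in words:                    # chain the bigram LM through the edge
                    s += lm(prev, w)
                    prev = w
                if s > best[(end, prev)][0]:
                    best[(end, prev)] = (s, ((p, last), words))
    # Trace back the best hypothesis covering all n source positions.
    key = max((k for k in best if k[0] == n), key=lambda k: best[k][0])
    out = []
    while best[key][1] is not None:
        prev_key, words = best[key][1]
        out = list(words) + out
        key = prev_key
    return out
```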

105 Testing Conditions
Tested on a section of the JHU-provided data: 258 sentences with four reference translations.
- SMT system (stand-alone)
- EBMT system (stand-alone)
- XFER system (naïve decoding)
- XFER system with "strong" decoder:
  – No grammar rules (baseline)
  – Manually developed grammar rules
  – Automatically learned grammar rules
- XFER+SMT with strong decoder (MEMT)

106 Results on JHU Test Set (very miserly training data)
System                        | BLEU  | M-BLEU | NIST
EBMT                          | 0.058 | 0.165  | 4.22
SMT                           | 0.093 | 0.191  | 4.64
XFER (naïve) man grammar      | 0.055 | 0.177  | 4.46
XFER (strong) no grammar      | 0.109 | 0.224  | 5.29
XFER (strong) learned grammar | 0.116 | 0.231  | 5.37
XFER (strong) man grammar     | 0.135 | 0.243  | 5.59
XFER+SMT                      | 0.136 | 0.243  | 5.65

107 Effect of Reordering in the Decoder [chart]

108 Observations and Lessons (I)
- XFER with the strong decoder outperformed SMT, even without any grammar rules, in the miserly data scenario
  – SMT was trained on elicited phrases that are very short
  – SMT has insufficient data to train more discriminative translation probabilities
  – XFER takes advantage of morphology: token coverage without morphology 0.6989, with morphology 0.7892
- Manual grammar currently somewhat better than the automatically learned grammar
  – Learned rules did not yet use version-space learning
  – Large room for improvement on learning rules
  – Importance of effective, well-founded scoring of learned rules

109 Observations and Lessons (II)
- MEMT (XFER and SMT) based on the strong decoder produced the best results in the miserly scenario
- Reordering within the decoder provided very significant score improvements
  – Much room for more sophisticated grammar rules
  – The strong decoder can carry some of the reordering "burden"

110 Conclusions
- Transfer rules (both manual and learned) offer significant contributions that can complement existing data-driven approaches
  – Also in medium and large data settings?
- Initial steps toward development of a statistically grounded transfer-based MT system with:
  – Rules that are scored based on a well-founded probability model
  – Strong and effective decoding that incorporates the most advanced techniques used in SMT decoding
- Working from the "opposite" end of research on incorporating models of syntax into "standard" SMT systems [Knight et al]
- Our direction makes sense in the limited data scenario

111 Future Directions
- Continued work on automatic rule learning (especially Seeded Version Space Learning)
- Improved leveraging from manual grammar resources, interaction with bilingual speakers
- Developing a well-founded model for assigning scores (probabilities) to transfer rules
- Improving the strong decoder to better fit the specific characteristics of the XFER model
- MEMT with improved:
  – Combination of output from different translation engines with different scorings
  – Strong decoding capabilities

112 Rule Learning - Overview
- Goal: acquire syntactic transfer rules
- Use available knowledge from the source side (grammatical structure)
- Three steps:
  1. Flat Seed Generation: first guesses at transfer rules; no syntactic structure
  2. Compositionality: use previously learned rules to add structure
  3. Seeded Version Space Learning: refine rules by generalizing with validation

113 Flat Seed Generation
Create a transfer rule that is specific to the sentence pair, but abstracted to the POS level. No syntactic structure.
Element            | Source
SL POS sequence    | f-structure
TL POS sequence    | TL dictionary, aligned SL words
Type information   | corpus, same on SL and TL
Alignments         | informant
x-side constraints | f-structure
y-side constraints | TL dictionary, aligned SL words (list of projecting features)

114 Flat Seed Generation - Example
The highly qualified applicant did not accept the offer.
Der äußerst qualifizierte Bewerber nahm das Angebot nicht an.
((1,1),(2,2),(3,3),(4,4),(6,8),(7,5),(7,9),(8,6),(9,7))
S::S [det adv adj n aux neg v det n] -> [det adv adj n v det n neg vpart]
(;;alignments:
 (x1::y1) (x2::y2) (x3::y3) (x4::y4) (x6::y8) (x7::y5) (x7::y9) (x8::y6) (x9::y7)
 ;;constraints:
 ((x1 def) = *+)
 ((x4 agr) = *3-sing)
 ((x5 tense) = *past)
 ….
 ((y1 def) = *+)
 ((y3 case) = *nom)
 ((y4 agr) = *3-sing)
 ….
)

115 Compositionality - Overview
- Traverse the c-structure of the English sentence, add compositional structure for translatable chunks
- Adjust constituent sequences, alignments
- Remove unnecessary constraints, i.e. those that are contained in the lower-level rule
- Adjust constraints: use the f-structure of the correct translation vs. the f-structures of incorrect translations to introduce context constraints

116 Compositionality - Example
Flat rule:
S::S [det adv adj n aux neg v det n] -> [det adv adj n v det n neg vpart]
(;;alignments:
 (x1::y1) (x2::y2) (x3::y3) (x4::y4) (x6::y8) (x7::y5) (x7::y9) (x8::y6) (x9::y7)
 ;;constraints:
 ((x1 def) = *+) ((x4 agr) = *3-sing) ((x5 tense) = *past) ….
 ((y1 def) = *+) ((y3 case) = *nom) ((y4 agr) = *3-sing) ….
)
Compositional rule:
S::S [NP aux neg v det n] -> [NP v det n neg vpart]
(;;alignments:
 (x1::y1) (x3::y5) (x4::y2) (x4::y6) (x5::y3) (x6::y4)
 ;;constraints:
 ((x2 tense) = *past) ….
 ((y1 def) = *+) ((y1 case) = *nom) ….
)
Sub-constituent rule:
NP::NP [det ADJP n] -> [det ADJP n]
((x1::y1) …
 ((y3 agr) = *3-sing)
 ((x3 agr) = *3-sing) ….)

117 Seeded Version Space Learning: Overview
- Goal: further generalize the acquired rules
- Methodology:
  – Preserve general structural transfer
  – Consider relaxing specific feature constraints
- Seed rules are grouped into clusters of similar transfer structure (type, constituent sequences, alignments)
- Each cluster forms a version space: a partially ordered hypothesis space with a specific and a general boundary
- The seed rules in a group form the specific boundary of a version space
- The general boundary is the (implicit) transfer rule with the same type, constituent sequences, and alignments, but no feature constraints

118 Seeded Version Space Learning
[Version-space figure over rule clusters such as NP and VP]
1. Group seed rules into version spaces as above.
2. Make use of the partial order of rules in the version space. The partial order is defined via the f-structures satisfying the constraints.
3. Generalize in the space by repeated merging of rules:
   – Deletion of a constraint
   – Moving value constraints to agreement constraints, e.g. ((x1 num) = *pl), ((x3 num) = *pl) -> ((x1 num) = (x3 num))
4. Check the translation power of generalized rules against the sentence pairs.

119 Seeded Version Space Learning: Example
Two seed rules:
S::S [NP aux neg v det n] -> [NP v det n neg vpart]
(;;alignments: (x1::y1) (x3::y5) (x4::y2) (x4::y6) (x5::y3) (x6::y4)
 ;;constraints: ((x2 tense) = *past) …. ((y1 def) = *+) ((y1 case) = *nom)
 ((y1 agr) = *3-sing) … ((y3 agr) = *3-sing) ((y4 agr) = *3-sing) … )
S::S [NP aux neg v det n] -> [NP v det n neg vpart]
(;;alignments: (x1::y1) (x3::y5) (x4::y2) (x4::y6) (x5::y3) (x6::y4)
 ;;constraints: ((x2 tense) = *past) … ((y1 def) = *+) ((y1 case) = *nom)
 ((y1 agr) = *3-plu) … ((y3 agr) = *3-plu) ((y4 agr) = *3-plu) … )
Merged rule (value constraints raised to an agreement constraint):
S::S [NP aux neg v det n] -> [NP v det n neg vpart]
(;;alignments: (x1::y1) (x3::y5) (x4::y2) (x4::y6) (x5::y3) (x6::y4)
 ;;constraints: ((x2 tense) = *past) … ((y1 def) = *+) ((y1 case) = *nom)
 ((y4 agr) = (y3 agr)) … )

120 Preliminary Evaluation
- English to German
- Corpus of 141 ADJPs, simple NPs and sentences
- 10-fold cross-validation experiment
- Goals:
  – Do we learn useful transfer rules?
  – Does compositionality improve generalization?
  – Does VS-learning improve generalization?

121 Summary of Results
- Average translation accuracy on the cross-validation test set was 62%
  – Without VS-learning: 43%
  – Without compositionality: 57%
- Average number of VSs: 24
- Average number of sentences per VS: 3.8
- Average number of merges per VS: 1.6
- Percent of compositional rules: 34%

122 Conclusions
- New paradigm for learning transfer rules from a pre-designed elicitation corpus
- Geared toward languages with very limited resources
- Preliminary experiments validate the approach: compositionality and VS-learning improve generalization

123 Future Work
1. Larger, more diverse elicitation corpus
2. Additional languages (Mapudungun…)
3. Less information on the TL side
4. Reverse translation direction
5. Refine the various algorithms:
   – Operators for VS generalization
   – Generalization VS search
   – Layers for compositionality
6. User interactive verification

124 Seeded Version Space Learning: Generalization
The partial order of the version space:
- Definition: a transfer rule tr1 is strictly more general than another transfer rule tr2 if all f-structures that are satisfied by tr2 are also satisfied by tr1.
Generalize rules by merging them:
- Deletion of a constraint
- Raising two value constraints to an agreement constraint, e.g. ((x1 num) = *pl), ((x3 num) = *pl) -> ((x1 num) = (x3 num))

125 Seeded Version Space Learning: Merging Two Rules
The merging algorithm proceeds in three steps. To merge tr1 and tr2 into tr_merged:
1. Copy all constraints that are in both tr1 and tr2 into tr_merged.
2. Consider tr1 and tr2 separately. For the remaining constraints in tr1 and tr2, perform all possible instances of raising value constraints to agreement constraints.
3. Repeat step 1.

126 Seeded Version Space Learning: The Search
- The Seeded Version Space algorithm itself is the repeated generalization of rules by merging
- A merge is successful if the set of sentences that can correctly be translated with the merged rule is a superset of the union of the sets that can be translated with the unmerged rules, i.e. check the power of the rule
- Merge until no more successful merges

127 Constructing a Network of Candidate Pattern Sets (An Example)
Example vocabulary: blame, blamed, blames, roamed, roaming, roams, solve, solves, solving

128 [Network construction animation over the example vocabulary: blame, blamed, blames, roamed, roaming, roams, solve, solves, solving] Scheme Ø.s {blame, solve} is added.

129 Scheme Ø.s.d {blame} is added.

130 [Same network state]

131 Scheme s {blame, roam, solve} is added.

132 [Same network state]

133 Scheme e.es {blam, solv} is added.

134 [Same network state]

135 Scheme me.mes {bla} is added.

136 Nov 17, 2005Learning-based MT136 Add Test to the Generate Finite state hub searching algorithm (Johnson and Martin, 2003) can weed out unlikely morpheme boundaries to speed up network generation m t t ing s Ø m t t z r e o s t a y e a a ngi Ø ries t.ting.ts res retrea Ø.ing.s rest retreat roam t.ting res retrea Ø.ing rest retreat retry roam 136

137 [Lattice figure] Each c-suffix is a random variable with a value equal to the count of the c-stems that occur with that suffix.
Use a Χ² test:
- Reject the hypothesis a ⊥ as (p-value << 0.005)
- Accept the hypothesis a ⊥ tro (p-value = 0.2)
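A minimal sketch of such a test with SciPy, building a 2x2 contingency table over c-stems (does the stem occur with suffix A? with suffix B?). The stand-in stem sets below just reproduce the slide's counts; the talk's exact statistic may differ:

```python
from scipy.stats import chi2_contingency

def suffixes_independent_p(stems_a, stems_b, n_stems):
    """p-value for independence of two c-suffixes over n_stems c-stems.
    A small p-value means the suffixes co-select stems, i.e. they
    plausibly belong to the same paradigm."""
    both = len(stems_a & stems_b)
    only_a = len(stems_a - stems_b)
    only_b = len(stems_b - stems_a)
    neither = n_stems - both - only_a - only_b
    chi2, p, dof, expected = chi2_contingency([[both, only_a], [only_b, neither]])
    return p

# Stand-ins matching the slide: |a| = 1237, |as| = 404, |a ∩ as| = 199 (scheme a.as),
# over 6,975 types.
a_stems = set(range(1237))
as_stems = set(range(199)) | set(range(10000, 10205))
print(suffixes_independent_p(a_stems, as_stems, 6975))  # << 0.005 -> reject independence
```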

138 Weight c-stems by:
- Length
- Length of the longest c-suffix that attaches
- Frequency
Currently each c-stem is implicitly weighted equally.
[Lattice figure]

139 Some schemes are absent from this network (e.g. a.os.tro).
Sub-network density: every descendant of a.as.o.os is in the network – not true for a.as.o.os.tro.
[Lattice figure]

140 Word-to-Morpheme Segmentation
- De facto standard measure for unsupervised morphology induction
- Prerequisite for many NLP tasks:
  – Machine translation
  – Speech recognition of highly inflecting languages

141 [Tree figures: "The trees fell" / Spanish "Los árboles cayeron", and "The tree fell" / "El árbol cayó"]
Subject number is marked on: the N head (-es), the dependent Det (El vs. Los), and the governing V (-ó vs. -eron).
((TENSE past) (LEXICAL-ASPECT activity) ... (SUBJ ((NUM sg) (PERSON 3sg) ...)))
((TENSE past) (LEXICAL-ASPECT activity) ... (SUBJ ((NUM pl) (PERSON 3sg) ...)))

142 Morphology Learning
[Scheme examples: Ø.ed.ly (11: clear, direct, present, quiet, …); Ø.ed.ing.ly (6: clear, open, present); ed.ly (12: bodi, clear, correct, quiet, …); Ø.ed.ing.ly.s (4: clear, open, …); Ø.ed.ing (201: aid, clear, defend, deliver, …); d.ded.ding (27: ai, boar, defen, …); d.ded.ding.ds (19: ad, boar, defen, …); Ø.ed.ing.s (106: clear, defend, open, present, …)]
AVENUE approach:
- Organize the raw data in the form of a network of paradigm candidate schemes
- Search the network for a collection of schemes that represent true morphology paradigms of the language
- Learn mappings between the schemes and features/functions using minimal pairs of elicited data
- Construct an analyzer based on the collection of schemes and the acquired function mappings

143 Figure: Hierarchical scheme lattice automatically derived from a Spanish newswire corpus of 40,011 words and 6,975 unique types.
[Nodes pair c-suffix sets with c-stem counts and examples, e.g. a.as.o.os (43: african, cas, jurídic, l, …), a (1237: huelg, ib, id, iglesi, …), o.os (268: human, implicad, indici, indocumentad, …)]

