1 The AVENUE Project: Bootstrapping MT Prototypes for Languages with Limited Resources
Faculty: Alon Lavie, Jaime Carbonell, Lori Levin, Ralf Brown Students: Erik Peterson, Christian Monson, Ariadna Font-Llitjos, Alison Alvarez

2 Progression of MT
- Started with rule-based systems: very large expert human effort to construct language-specific resources (grammars, lexicons). High-quality MT was extremely expensive, so it was built for only a handful of language pairs.
- Along came EBMT and then SMT, which replaced human effort with extremely large volumes of parallel text data. Less expensive, but still feasible for only a small number of language pairs.
- We "traded" human labor for data.
- Where does this take us in 5-10 years? Large parallel corpora for perhaps a modest set of language pairs. What about all the other languages?
- Is all this data (with its very shallow representation of language structure) really necessary?
- Can we build MT approaches that learn deeper levels of language structure and how they map from one language to another?

3 Why Machine Translation for Languages with Limited Resources?
- We are in the age of the information explosion: with the internet, the web, and Google, anyone can get the information they want, anytime.
- But what about the text in all those other languages? How do their speakers read all this English material? How do we read what they put online?
- MT for these languages would enable:
  - Better government access to native, indigenous, and minority communities
  - Better minority and native community participation in information-rich activities (health care, education, government) without giving up their languages
  - Civilian and military applications (e.g., disaster relief)
  - Language preservation

4 The Roadmap to Learning-based MT
- Automatic acquisition of the necessary language resources and knowledge using machine learning methodologies:
  - Learning morphology (analysis/generation)
  - Rapid acquisition of broad-coverage word-to-word and phrase-to-phrase translation lexicons
  - Learning of syntactic structural mappings: tree-to-tree structure transformations [Knight et al.], [Eisner], [Melamed] require parse trees for both languages; we instead learn syntactic transfer rules with resources (grammar, parses) for just one of the two languages
  - Automatic rule refinement and/or post-editing
- Effective integration of acquired knowledge with statistical/distributional information

5 CMU's AVENUE Approach
- Elicitation: use bilingual native informants to produce a small, high-quality, word-aligned bilingual corpus of translated phrases and sentences
  - Building elicitation corpora from feature structures
  - Feature detection and navigation
- Transfer-rule learning: apply ML-based methods to automatically acquire syntactic transfer rules for translation between the two languages
  - Learn from major language to minor language
  - Translate from minor language to major language
- XFER + Decoder: the XFER engine produces a lattice of all possible transferred structures at all levels; the decoder searches and selects the best-scoring combination
- Rule refinement: refine the acquired rules via a process of interaction with bilingual informants
- Morphology learning
- Word and phrase bilingual lexicon acquisition

6 AVENUE Architecture
[Architecture diagram] The Learning Module turns word-aligned elicited data into transfer rules and a translation lexicon, for example:

{PP,4894}
;;Score:
PP::PP [NP POSTP] -> [PREP NP]
((X2::Y1)
 (X1::Y2))

At run time, the transfer system applies the transfer rules and translation lexicon to produce a lattice, which the decoder searches using an English language model and word-to-word translation probabilities.

7 Learning Transfer-Rules for Languages with Limited Resources
Rationale:
- Bilingual native informant(s) can translate and align a small pre-designed elicitation corpus, using an elicitation tool
- The elicitation corpus is designed to be typologically and structurally comprehensive and compositional
- The transfer-rule engine and a new learning approach support acquisition of generalized transfer rules from the data

8 Transfer Rule Formalism
;SL: the old man, TL: ha-ish ha-zaqen
NP::NP [DET ADJ N] -> [DET N DET ADJ]
(
 (X1::Y1) (X1::Y3) (X2::Y4) (X3::Y2)
 ((X1 AGR) = *3-SING)
 ((X1 DEF) = *DEF)
 ((X3 AGR) = *3-SING)
 ((X3 COUNT) = +)
 ((Y1 DEF) = *DEF)
 ((Y3 DEF) = *DEF)
 ((Y2 AGR) = *3-SING)
 ((Y2 GENDER) = (Y4 GENDER))
)
A rule specifies:
- Type information
- Part-of-speech/constituent information
- Alignments
- x-side constraints
- y-side constraints
- xy-constraints, e.g. ((Y1 AGR) = (X1 AGR))
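To make the formalism concrete, here is a minimal Python sketch of how such a rule could be represented in memory. The class and field names (TransferRule, Constraint) are illustrative assumptions, not AVENUE's actual implementation.

from dataclasses import dataclass, field

@dataclass
class Constraint:
    # A path like ("X1", "AGR") and either a literal value ("*3-SING")
    # or another path, which makes it an agreement constraint.
    lhs: tuple
    rhs: object  # str for value constraints, tuple for agreement constraints

@dataclass
class TransferRule:
    sl_type: str        # e.g. "NP"
    tl_type: str        # e.g. "NP"
    sl_sequence: list   # e.g. ["DET", "ADJ", "N"]
    tl_sequence: list   # e.g. ["DET", "N", "DET", "ADJ"]
    alignments: list    # 1-based (SL, TL) index pairs, e.g. [(1, 1), (1, 3)]
    constraints: list = field(default_factory=list)

# The NP rule from this slide: "the old man" -> "ha-ish ha-zaqen"
rule = TransferRule(
    "NP", "NP",
    ["DET", "ADJ", "N"], ["DET", "N", "DET", "ADJ"],
    [(1, 1), (1, 3), (2, 4), (3, 2)],
    [Constraint(("X1", "AGR"), "*3-SING"),
     Constraint(("X1", "DEF"), "*DEF"),
     Constraint(("Y2", "GENDER"), ("Y4", "GENDER"))],  # agreement constraint
)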

9 Rule Learning - Overview
Goal: acquire syntactic transfer rules, using available knowledge from the source side (grammatical structure). Three steps:
1. Flat Seed Generation: first guesses at transfer rules; flat syntactic structure
2. Compositionality: use previously learned rules to add hierarchical structure
3. Constraint Learning: refine rules by learning appropriate feature constraints

10 Flat Seed Rule Generation
Learning example (NP):
  Eng: the big apple
  Heb: ha-tapuax ha-gadol
Generated seed rule:
  NP::NP [ART ADJ N] -> [ART N ART ADJ]
  ((X1::Y1) (X1::Y3) (X2::Y4) (X3::Y2))
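The seed step is essentially mechanical once the POS tags and word alignments are in hand. A minimal sketch under those assumptions (the function name flat_seed_rule is hypothetical):

def flat_seed_rule(sl_pos, tl_pos, alignments, label="NP"):
    """Build a flat, POS-level transfer rule from one aligned example.

    alignments are 1-based (SL index, TL index) pairs from the informant.
    """
    align_str = " ".join(f"(X{i}::Y{j})" for i, j in alignments)
    return (f"{label}::{label} [{' '.join(sl_pos)}] -> [{' '.join(tl_pos)}]\n"
            f"( {align_str} )")

# English "the big apple" / Hebrew "ha-tapuax ha-gadol":
print(flat_seed_rule(
    ["ART", "ADJ", "N"],
    ["ART", "N", "ART", "ADJ"],
    [(1, 1), (1, 3), (2, 4), (3, 2)],
))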

11 Compositionality
Initial flat rules:
  S::S [ART ADJ N V ART N] -> [ART N ART ADJ V P ART N]
  ((X1::Y1) (X1::Y3) (X2::Y4) (X3::Y2) (X4::Y5) (X5::Y7) (X6::Y8))

  NP::NP [ART ADJ N] -> [ART N ART ADJ]
  ((X1::Y1) (X1::Y3) (X2::Y4) (X3::Y2))

  NP::NP [ART N] -> [ART N]
  ((X1::Y1) (X2::Y2))

Generated compositional rule:
  S::S [NP V NP] -> [NP V P NP]
  ((X1::Y1) (X2::Y2) (X3::Y4))
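A minimal sketch of the collapsing step behind this example, assuming a covered SL span always maps to a contiguous TL span (names are illustrative). Applying it twice, once per NP, turns the flat S rule above into the compositional one:

def compose(sl_seq, tl_seq, alignments, sl_span, tl_span, label):
    """Collapse sl_seq[sl_span] / tl_seq[tl_span] (1-based, inclusive)
    into a single constituent `label`, renumbering alignments."""
    s0, s1 = sl_span
    t0, t1 = tl_span
    new_sl = sl_seq[:s0 - 1] + [label] + sl_seq[s1:]
    new_tl = tl_seq[:t0 - 1] + [label] + tl_seq[t1:]

    def remap(i, lo, hi):
        if i < lo:
            return i
        if i > hi:
            return i - (hi - lo)   # shift past the collapsed span
        return lo                  # inside the span -> the new constituent
    new_align = sorted({(remap(x, s0, s1), remap(y, t0, t1))
                        for x, y in alignments})
    return new_sl, new_tl, new_align

# Collapse the first NP ("ART ADJ N" / "ART N ART ADJ") in the flat S rule:
sl = ["ART", "ADJ", "N", "V", "ART", "N"]
tl = ["ART", "N", "ART", "ADJ", "V", "P", "ART", "N"]
align = [(1, 1), (1, 3), (2, 4), (3, 2), (4, 5), (5, 7), (6, 8)]
sl, tl, align = compose(sl, tl, align, (1, 3), (1, 4), "NP")
# Then collapse the second NP ("ART N"), giving [NP V NP] -> [NP V P NP]:
print(compose(sl, tl, align, (3, 4), (4, 5), "NP"))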

12 Constraint Learning
Input: rules and their example sets
  S::S [NP V NP] -> [NP V P NP]    {ex1,ex12,ex17,ex26}
  ((X1::Y1) (X2::Y2) (X3::Y4))

  NP::NP [ART ADJ N] -> [ART N ART ADJ]    {ex2,ex3,ex13}
  ((X1::Y1) (X1::Y3) (X2::Y4) (X3::Y2))

  NP::NP [ART N] -> [ART N]    {ex4,ex5,ex6,ex8,ex10,ex11}
  ((X1::Y1) (X2::Y2))

Output: rules with feature constraints
  S::S [NP V NP] -> [NP V P NP]
  ((X1::Y1) (X2::Y2) (X3::Y4)
   ((X1 NUM) = (X2 NUM))
   ((Y1 NUM) = (Y2 NUM))
   ((X1 NUM) = (Y1 NUM)))
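A minimal sketch of the underlying idea: a value constraint survives only if it holds across all of a rule's examples, while slots whose values always co-vary yield the more general agreement constraint instead. The example f-structures are invented for illustration:

def learn_constraints(examples, slots, feature):
    values = {s: {ex[s].get(feature) for ex in examples} for s in slots}
    constraints = []
    for a in slots:
        for b in slots:
            if a < b and all(ex[a].get(feature) == ex[b].get(feature)
                             for ex in examples):
                constraints.append(f"(({a} {feature}) = ({b} {feature}))")
    for s in slots:
        # Constant across all examples -> keep as a value constraint.
        if len(values[s]) == 1 and None not in values[s]:
            constraints.append(f"(({s} {feature}) = *{values[s].pop()})")
    return constraints

examples = [  # hypothetical NUM values observed for X1 (subject) and X2 (verb)
    {"X1": {"NUM": "SG"}, "X2": {"NUM": "SG"}},
    {"X1": {"NUM": "PL"}, "X2": {"NUM": "PL"}},
]
print(learn_constraints(examples, ["X1", "X2"], "NUM"))
# -> ['((X1 NUM) = (X2 NUM))'] : values vary, so only the agreement survives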

13 AVENUE Prototypes
- General XFER framework under development for the past three years
- Prototype systems so far: German-to-English, Spanish-to-English, Hindi-to-English, Hebrew-to-English
- In progress or planned: Mapudungun-to-Spanish, Quechua-to-Spanish, Arabic-to-English, native Brazilian languages to Brazilian Portuguese

14 Morphology Learning
Unsupervised learning of morphemes and their function from raw monolingual data:
- Segmentation of words into morphemes
- Identification of morphological paradigms (inflections and derivations)
- Learning the association between morphemes and their function in the language

15 Morphology Learning
AVENUE approach:
- Organize the raw data in the form of a network of paradigm candidate schemes
- Search the network for a collection of schemes that represent true morphology paradigms of the language
- Learn mappings between the schemes and features/functions using minimal pairs of elicited data
- Construct an analyzer based on the collection of schemes and the acquired function mappings

Example candidate schemes (suffix set, stem count, example stems):
  Ø.ed.ing        201  aid defend deliver
  Ø.ed.ing.s      106
  d.ded.ding       27  ai boar defen
  d.ded.ding.ds    19  ad
  ed.ly            12  bodi correct
  Ø.ed.ly          11  clear direct present quiet
  Ø.ed.ing.ly       6  open
  Ø.ed.ing.ly.s     4
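A minimal sketch of the first step, building candidate schemes: split each word at every position, collect the suffix set observed for each candidate stem, and count how many stems share each suffix-set signature. Names and the tiny word list are illustrative only:

from collections import defaultdict

def candidate_schemes(words):
    suffixes = defaultdict(set)          # candidate stem -> suffixes seen
    for w in words:
        for i in range(1, len(w) + 1):
            suffixes[w[:i]].add(w[i:] or "Ø")   # "Ø" marks the empty suffix
    schemes = defaultdict(list)          # suffix-set signature -> stems
    for stem, sufs in suffixes.items():
        if len(sufs) > 1:                # keep stems seen with >1 suffix
            schemes[".".join(sorted(sufs))].append(stem)
    return schemes

words = ["aid", "aided", "aiding", "defend", "defended", "defending"]
for sig, stems in candidate_schemes(words).items():
    print(f"{sig:12} {len(stems):2} {' '.join(sorted(stems))}")
# Produces both the true scheme (ed.ing.Ø: aid, defend) and the spurious
# competitor (d.ded.ding: ai, defen) that the network search must rank.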

16 [Figure: hierarchical scheme lattice automatically derived from a Spanish newswire corpus of 40,011 words and 6,975 unique types. Each node pairs a candidate suffix set with its stem count and example stems, e.g. a.as.o.os (43: african, cas, jurídic), a.as (199: huelg, incluid, industri, inundad), o.os (268: indici, indocumentad), os (534: humorístic, human, hígad).]

17 Automated Rule Refinement
Rationale:
- Bilingual informants can identify translation errors and pinpoint where they occur
- A sophisticated trace of the translation path can identify likely sources for the error and perform "blame assignment"
- Rule-refinement operators can be developed to modify the underlying translation grammar (and lexicon) based on characteristics of the error source (see the sketch below):
  - Add or delete feature constraints from a rule
  - Bifurcate a rule into two rules (general and specific)
  - Add or correct lexical entries
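A minimal sketch of the three operators over a toy rule representation (plain dicts, so the snippet stays self-contained); the blame-assignment logic that decides when to apply each operator is elided:

import copy

def add_constraint(rule, constraint):
    r = copy.deepcopy(rule)
    r["constraints"].append(constraint)
    return r

def delete_constraint(rule, constraint):
    r = copy.deepcopy(rule)
    r["constraints"].remove(constraint)
    return r

def bifurcate(rule, extra_constraint):
    """Split into a general rule (unchanged) and a more specific variant."""
    return rule, add_constraint(rule, extra_constraint)

rule = {"lhs": "NP::NP [DET N] -> [DET N]",
        "constraints": ["((X1 AGR) = *3-SING)"]}
general, specific = bifurcate(rule, "((X2 COUNT) = +)")
print(specific["constraints"])  # ['((X1 AGR) = *3-SING)', '((X2 COUNT) = +)']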

18 New Research Directions
- Automatic transfer rule learning:
  - In the "large-data" scenario: from large volumes of uncontrolled, automatically word-aligned parallel text
  - In the absence of morphology or POS-annotated lexica
  - Learning mappings for non-compositional structures
  - Effective models for rule scoring, both for decoding (using scores at runtime) and for pruning the large collections of learned rules
  - Learning unification constraints (VSL)
- Integrated XFER engine and decoder:
  - Improved models for scoring tree-to-tree mappings, integrated with the LM and other knowledge sources in the course of the search

19 Missing Science
Monolingual learning tasks:
- Learning morphology: morphemes and their meaning
- Learning syntactic and semantic structures: grammar induction
Bilingual learning tasks:
- Automatic acquisition of word and phrase translation lexicons
- Learning structural mappings (syntactic, semantic, non-compositional)
- Models that effectively combine learned symbolic knowledge with statistical information: new "decoders"


21 English-Chinese Example

22 English-Hindi Example

23 Spanish-Mapudungun Example

24 English-Arabic Example

25 Transfer Rule Formalism (II)
;SL: the old man, TL: ha-ish ha-zaqen
NP::NP [DET ADJ N] -> [DET N DET ADJ]
(
 (X1::Y1) (X1::Y3) (X2::Y4) (X3::Y2)
 ((X1 AGR) = *3-SING)
 ((X1 DEF) = *DEF)
 ((X3 AGR) = *3-SING)
 ((X3 COUNT) = +)
 ((Y1 DEF) = *DEF)
 ((Y3 DEF) = *DEF)
 ((Y2 AGR) = *3-SING)
 ((Y2 GENDER) = (Y4 GENDER))
)
The same rule, viewed as value constraints (e.g. ((X1 AGR) = *3-SING)) and agreement constraints (e.g. ((Y2 GENDER) = (Y4 GENDER))).

26 AVENUE Partners
Language: Country: Institutions
- Mapudungun (in place): Chile: Universidad de la Frontera, Institute for Indigenous Studies, Ministry of Education
- Quechua (discussion): Peru
- Aymara: Bolivia, Peru

27 The Transfer Engine
- Analysis: source text is parsed into its grammatical structure, which determines transfer application ordering. Example: 他 看 书。(he read book) parses as S -> NP(N 他) VP(V 看 NP(N 书)).
- Transfer: a target-language tree is created by reordering, insertion, and deletion: the article "a" is inserted into the object NP, and source words are translated with the transfer lexicon, giving "he read DET(a) N(book)".
- Generation: target-language constraints are checked and the final translation is produced; e.g. "reads" is chosen over "read" to agree with "he". Final translation: "He reads a book."
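A minimal toy sketch of the three-stage flow. The stages here are stubs over strings; the real engine operates on parse trees and feature structures, and the insertion of "a" is folded into the toy lexicon rather than a transfer rule:

def analyze(source):
    # Parse the source into a (toy) structure; here just whitespace tokens.
    return source.split()

def transfer(tokens, lexicon):
    # Reorder/insert/delete and translate with the transfer lexicon.
    out = []
    for t in tokens:
        out.extend(lexicon.get(t, [t]))
    return out

def generate(tokens):
    # Check target-language constraints and produce the final string.
    return " ".join(tokens).capitalize() + "."

lexicon = {"他": ["he"], "看": ["reads"], "书": ["a", "book"]}
print(generate(transfer(analyze("他 看 书"), lexicon)))  # He reads a book.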

28 Seeded VSL: Some Open Issues
- Three types of constraints:
  - X-side: constrain the applicability of the rule
  - Y-side: assist in generation
  - X-Y: transfer features from SL to TL
- Which of the three types improves translation performance? One option: use rules without features to populate the lattice and let the decoder select the best translation. Another: learn only X-Y constraints, based on a list of universally projecting features.
- Other notions of version spaces of feature constraints: current feature learning is specific to rules that have identical transfer components. An important issue during transfer is disambiguating among rules that have the same SL side but different TL sides; can we learn effective constraints for this?

29 Examples of Learned Rules (Hindi-to-English)
{NP,14244} ;;Score: 0.0429
NP::NP [N] -> [DET N]
((X1::Y2))

{NP,14434} ;;Score: 0.0040
NP::NP [ADJ CONJ ADJ N] -> [ADJ CONJ ADJ N]
((X1::Y1) (X2::Y2) (X3::Y3) (X4::Y4))

{PP,4894} ;;Score:
PP::PP [NP POSTP] -> [PREP NP]
((X2::Y1) (X1::Y2))

30 XFER MT for Hebrew-to-English
A two-month intensive effort to apply our XFER approach to the development of a Hebrew-to-English MT system.
Challenges:
- No large parallel corpus
- Limited-coverage translation lexicon
- Rich morphology, with only an incomplete analyzer available
Accomplished:
- Collected available resources, established a methodology for processing Hebrew input
- Translated and aligned the elicitation corpus
- Learned XFER rules
- Developed a (small) manual XFER grammar as a point of comparison
- System debugging and development
- Evaluated performance on unseen test data using automatic evaluation metrics

31 Hebrew-to-English Pipeline: Transfer Rules, Transfer Engine, Decoder
[Pipeline diagram] Hebrew input (בשורה הבאה) passes through preprocessing and morphology; the transfer engine then applies the transfer rules and translation lexicon to build a translation output lattice, which the decoder scores with an English language model to produce the English output "in the next line".

Example transfer rule:
{NP1,3}
NP1::NP1 [NP1 "H" ADJ] -> [ADJ NP1]
((X3::Y1) (X1::Y2)
 ((X1 def) = +)
 ((X1 status) =c absolute)
 ((X1 num) = (X3 num))
 ((X1 gen) = (X3 gen))
 (X0 = X1))

Example translation lexicon entries:
N::N |: ["$WR"] -> ["BULL"]
((X1::Y1) ((X0 NUM) = s) ((Y0 lex) = "BULL"))
N::N |: ["$WRH"] -> ["LINE"]
(((Y0 lex) = "LINE"))

Translation output lattice (partial), with edges spanning input positions, e.g. (1 2 "THE"), (0 2 "IN"), (0 4 "IN THE NEXT").

32 Morphology Example
Input word: B$WRH
0 1 2 3 4
|--------B$WRH--------|
|        B$WRH        |
|-----B-----|$WR|--H--|
|--B--|-H--|--$WRH---|

33 Morphology Example
Candidate analyses for B$WRH:
Y0: ((SPANSTART 0) (SPANEND 4) (LEX B$WRH) (POS N) (GEN F) (NUM S) (STATUS ABSOLUTE))
Y1: ((SPANSTART 0) (SPANEND 2) (LEX B) (POS PREP))
Y2: ((SPANSTART 1) (SPANEND 3) (LEX $WR) (POS N) (GEN M) (NUM S) (STATUS ABSOLUTE))
Y3: ((SPANSTART 3) (SPANEND 4) (LEX $LH) (POS POSS))
Y4: ((SPANSTART 0) (SPANEND 1) (LEX B) (POS PREP))
Y5: ((SPANSTART 1) (SPANEND 2) (LEX H) (POS DET))
Y6: ((SPANSTART 2) (SPANEND 4) (LEX $WRH) (POS N) (GEN F) (NUM S) (STATUS ABSOLUTE))
Y7: ((SPANSTART 0) (SPANEND 4) (LEX B$WRH) (POS LEX))

34 Sample Output (dev-data)
maxwell anurpung comes from ghana for israel four years ago and since worked in cleaning in hotels in eilat a few weeks ago announced if management club hotel that for him to leave israel according to the government instructions and immigration police in a letter in broken english which spread among the foreign workers thanks to them hotel for their hard work and announced that will purchase for hm flight tickets for their countries from their money

35 Evaluation Results
Test set of 62 sentences from the Haaretz newspaper, 2 reference translations:

System   BLEU    NIST    P       R       METEOR
No Gram  0.0616  3.4109  0.4090  0.4427  0.3298
Learned  0.0774  3.5451  0.4189  0.4488  0.3478
Manual   0.1026  3.7789  0.4334  0.4474  0.3617

36 Future Directions
- Continued work on automatic rule learning (especially Seeded Version Space Learning)
- Use the Hebrew and Hindi systems as test platforms for experimenting with advanced learning research
- Rule refinement via interaction with bilingual speakers
- Developing a well-founded model for assigning scores (probabilities) to transfer rules
- Redesigning and improving the decoder to better fit the specific characteristics of the XFER model
- Improved leveraging of manual grammar resources
- MEMT with improved combination of output from different translation engines (with different confidence scores) and strong decoding capabilities

37 Flat Seed Generation
Create a transfer rule that is specific to the sentence pair, but abstracted to the POS level; no syntactic structure.

Element: Source
- SL POS sequence: f-structure
- TL POS sequence: TL dictionary, aligned SL words
- Type information: corpus (same on SL and TL)
- Alignments: informant
- x-side constraints: f-structure
- y-side constraints: TL dictionary, aligned SL words (list of projecting features)

38 Compositionality - Overview
- Traverse the c-structure of the English sentence, adding compositional structure for translatable chunks
- Adjust constituent sequences and alignments
- Remove unnecessary constraints, i.e. those that are contained in the lower-level rule

39 Seeded Version Space Learning: Overview
Goal: add appropriate feature constraints to the acquired rules.
Methodology:
- Preserve the general structural transfer
- Learn specific feature constraints from the example set
- Seed rules are grouped into clusters of similar transfer structure (type, constituent sequences, alignments)
- Each cluster forms a version space: a partially ordered hypothesis space with a specific and a general boundary
- The seed rules in a group form the specific boundary of the version space
- The general boundary is the (implicit) transfer rule with the same type, constituent sequences, and alignments, but no feature constraints
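A minimal sketch of the grouping step: seed rules cluster on their structural signature, with feature constraints deliberately excluded from the key. Field names are illustrative:

from collections import defaultdict

def group_into_version_spaces(seed_rules):
    spaces = defaultdict(list)
    for r in seed_rules:
        key = (r["type"], tuple(r["sl_seq"]), tuple(r["tl_seq"]),
               tuple(sorted(r["alignments"])))
        spaces[key].append(r)   # these rules form the specific boundary
    return spaces

seeds = [
    {"type": "NP::NP", "sl_seq": ["ART", "N"], "tl_seq": ["ART", "N"],
     "alignments": [(1, 1), (2, 2)], "constraints": ["((x1 num) = *sg)"]},
    {"type": "NP::NP", "sl_seq": ["ART", "N"], "tl_seq": ["ART", "N"],
     "alignments": [(1, 1), (2, 2)], "constraints": ["((x1 num) = *pl)"]},
]
print(len(group_into_version_spaces(seeds)))  # 1: same structure, one space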

40 Seeded Version Space Learning: Generalization
The partial order of the version space:
- Definition: a transfer rule tr1 is strictly more general than another transfer rule tr2 if all f-structures that are satisfied by tr2 are also satisfied by tr1.
Generalize rules by merging them:
- Deletion of a constraint
- Raising two value constraints to an agreement constraint, e.g.
  ((x1 num) = *pl), ((x3 num) = *pl) -> ((x1 num) = (x3 num))
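A minimal sketch of the raising operation over (slot, feature, value) constraint triples, a simplified encoding chosen for the example:

def raise_to_agreement(constraints):
    out, used = [], set()
    cs = list(constraints)
    for i, (s1, f1, v1) in enumerate(cs):
        for j in range(i + 1, len(cs)):
            s2, f2, v2 = cs[j]
            if f1 == f2 and v1 == v2 and i not in used and j not in used:
                out.append(((s1, f1), "=", (s2, f2)))  # agreement constraint
                used.update((i, j))
    out.extend(c for k, c in enumerate(cs) if k not in used)
    return out

print(raise_to_agreement([("x1", "num", "*pl"), ("x3", "num", "*pl")]))
# -> [(('x1', 'num'), '=', ('x3', 'num'))] : strictly more general, since it
# also admits f-structures where both slots are singular.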

41 Seeded Version Space Learning
1. Group seed rules into version spaces as above.
2. Make use of the partial order of rules in the version space. The partial order is defined via the f-structures satisfying the constraints.
3. Generalize in the space by repeated merging of rules:
   - Deletion of a constraint
   - Moving value constraints to agreement constraints, e.g. ((x1 num) = *pl), ((x3 num) = *pl) -> ((x1 num) = (x3 num))
4. Check the translation power of the generalized rules against the sentence pairs.

42 Seeded Version Space Learning: The Search
- The Seeded Version Space algorithm itself is the repeated generalization of rules by merging
- A merge is successful if the set of sentences that can be correctly translated with the merged rule is a superset of the union of the sets that can be translated with the unmerged rules, i.e. we check the power of the rule
- Merge until no more successful merges are possible

43 Conclusions
- Transfer rules (both manual and learned) offer significant contributions that can complement existing data-driven approaches; also in medium- and large-data settings?
- Initial steps toward a statistically grounded transfer-based MT system with:
  - Rules that are scored based on a well-founded probability model
  - Strong and effective decoding that incorporates the most advanced techniques used in SMT decoding
- Working from the "opposite" end of research on incorporating models of syntax into "standard" SMT systems [Knight et al.]
- Our direction makes sense in the limited-data scenario

44 AVENUE Architecture
[Architecture diagram] Run-time module: SL input passes through morphology pre-processing and the SL parser into the transfer engine; its output goes through the TL generator and the decoder to produce TL output for the user. Learning module: the elicitation process feeds transfer-rule learning, which produces the transfer rules used by the transfer engine.

45 Learning Transfer-Rules for Languages with Limited Resources
Rationale:
- Large bilingual corpora are not available
- Bilingual native informant(s) can translate and align a small pre-designed elicitation corpus, using an elicitation tool
- The elicitation corpus is designed to be typologically comprehensive and compositional
- The transfer-rule engine and a new learning approach support acquisition of generalized transfer rules from the data

46 The Elicitation Corpus
- Translated and aligned by a bilingual informant
- Consists of linguistically diverse constructions
- Based on the elicitation and documentation work of field linguists (e.g. Comrie 1977, Bouquiaux 1992)
- Organized compositionally: elicit simple structures first, then use them as building blocks
- Goal: minimize size, maximize linguistic coverage

47 The Transfer Engine
- Analysis: source text is parsed into its grammatical structure, which determines transfer application ordering. Example: 他 看 书。(he read book) parses as S -> NP(N 他) VP(V 看 NP(N 书)).
- Transfer: a target-language tree is created by reordering, insertion, and deletion: the article "a" is inserted into the object NP, and source words are translated with the transfer lexicon, giving "he read DET(a) N(book)".
- Generation: target-language constraints are checked and the final translation is produced; e.g. "reads" is chosen over "read" to agree with "he". Final translation: "He reads a book."

48 Transfer Rule Formalism
;SL: the man, TL: der Mann
NP::NP [DET N] -> [DET N]
(
 (X1::Y1) (X2::Y2)
 ((X1 AGR) = *3-SING)
 ((X1 DEF) = *DEF)
 ((X2 AGR) = *3-SING)
 ((X2 COUNT) = +)
 ((Y1 AGR) = *3-SING)
 ((Y1 DEF) = *DEF)
 ((Y2 AGR) = *3-SING)
 ((Y2 GENDER) = (Y1 GENDER))
)
A rule specifies:
- Type information
- Part-of-speech/constituent information
- Alignments
- x-side constraints
- y-side constraints
- xy-constraints, e.g. ((Y1 AGR) = (X1 AGR))

49 Transfer Rule Formalism (II)
;SL: the man, TL: der Mann
NP::NP [DET N] -> [DET N]
(
 (X1::Y1) (X2::Y2)
 ((X1 AGR) = *3-SING)
 ((X1 DEF) = *DEF)
 ((X2 AGR) = *3-SING)
 ((X2 COUNT) = +)
 ((Y1 AGR) = *3-SING)
 ((Y1 DEF) = *DEF)
 ((Y2 AGR) = *3-SING)
 ((Y2 GENDER) = (Y1 GENDER))
)
The same rule, viewed as value constraints (e.g. ((X1 AGR) = *3-SING)) and agreement constraints (e.g. ((Y2 GENDER) = (Y1 GENDER))).

50 Rule Learning - Overview
Goal: acquire syntactic transfer rules, using available knowledge from the source side (grammatical structure). Three steps:
1. Flat Seed Generation: first guesses at transfer rules; flat syntactic structure
2. Compositionality: use previously learned rules to add hierarchical structure
3. Seeded Version Space Learning: refine rules by generalizing with validation (learn appropriate feature constraints)

51 Examples of Learned Rules (I)
{NP,14244} ;;Score: 0.0429
NP::NP [N] -> [DET N]
((X1::Y2))

{NP,14434} ;;Score: 0.0040
NP::NP [ADJ CONJ ADJ N] -> [ADJ CONJ ADJ N]
((X1::Y1) (X2::Y2) (X3::Y3) (X4::Y4))

{PP,4894} ;;Score:
PP::PP [NP POSTP] -> [PREP NP]
((X2::Y1) (X1::Y2))

52 A Limited Data Scenario for Hindi-to-English
Put together a scenario with "miserly" data resources:
- Elicited data corpus of translated phrases
- Cleaned portion (top 12%) of the LDC dictionary: ~2725 Hindi words (23,612 translation pairs)
- Manually acquired resources during the SLE:
  - 500 manual bigram translations
  - 72 manually written phrase transfer rules
  - 105 manually written postposition rules
  - 48 manually written time-expression rules
- No additional parallel text!

53 Manual Grammar Development
- Covers mostly NPs, PPs, and VPs (verb complexes)
- ~70 grammar rules, covering basic and recursive NPs and PPs, and verb complexes of the main tenses in Hindi (developed in two weeks)

54 Manual Transfer Rules: Example
;; PASSIVE OF SIMPLE PAST (NO AUX) WITH LIGHT VERB
;; passive of 43 (7b)
{VP,28}
VP::VP : [V V V] -> [Aux V]
(
 (X1::Y2)
 ((x1 form) = root)
 ((x2 type) =c light)
 ((x2 form) = part)
 ((x2 aspect) = perf)
 ((x3 lexwx) = 'jAnA')
 ((x3 form) = part)
 ((x3 aspect) = perf)
 (x0 = x1)
 ((y1 lex) = be)
 ((y1 tense) = past)
 ((y1 agr num) = (x3 agr num))
 ((y1 agr pers) = (x3 agr pers))
 ((y2 form) = part)
)

55 Manual Transfer Rules: Example
[Tree diagram: Hindi NP "jIvana ke eka aXyAya" mapping to English "a chapter of life"]
; NP1 ke NP2 -> NP2 of NP1
; Ex: jIvana ke eka aXyAya
; life of (one) chapter
; ==> a chapter of life
;
{NP,12}
NP::NP : [PP NP1] -> [NP1 PP]
(
 (X1::Y2)
 (X2::Y1)
 ; ((x2 lexwx) = 'kA')
)
{NP,13}
NP::NP : [NP1] -> [NP1]
((X1::Y1))
{PP,12}
PP::PP : [NP Postp] -> [Prep NP]

56 Adding a "Strong" Decoder
- The XFER system produces a full lattice
- Edges are scored using word-to-word translation probabilities, trained from the limited bilingual data
- The decoder uses an English LM (70m words)
- The decoder can also reorder words or phrases (up to 4 positions ahead); see the sketch below
- For XFER (strong), ONLY edges from the basic XFER system are used!
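A minimal sketch of decoding over such a lattice: edges carry translation scores, and a toy bigram LM scores adjacent words during the search. The reordering the real decoder performs (up to 4 positions) is omitted to keep the sketch monotone and short; all probabilities are invented:

import math

def decode(lattice, n, lm, t_prob):
    # lattice: list of (start, end, phrase); best[i] = (score, words) to node i
    best = {0: (0.0, [])}
    for i in range(n):                      # nodes in increasing order
        if i not in best:
            continue
        for start, end, phrase in lattice:
            if start != i:
                continue
            score, words = best[i]
            s = score + math.log(t_prob[phrase])
            prev = words[-1] if words else "<s>"
            for w in phrase.split():        # bigram LM over the joined path
                s += math.log(lm.get((prev, w), 1e-4))
                prev = w
            cand = (s, words + phrase.split())
            if end not in best or cand[0] > best[end][0]:
                best[end] = cand
    return best[n][1]

lattice = [(0, 1, "in"), (1, 2, "the"), (0, 2, "in the"), (2, 3, "next line")]
t_prob = {"in": 0.4, "the": 0.5, "in the": 0.3, "next line": 0.2}
lm = {("<s>", "in"): 0.1, ("in", "the"): 0.3, ("the", "next"): 0.2,
      ("next", "line"): 0.4}
print(" ".join(decode(lattice, 3, lm, t_prob)))  # in the next line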

57 Testing Conditions
Tested on a section of the JHU-provided data: 258 sentences with four reference translations.
- SMT system (stand-alone)
- EBMT system (stand-alone)
- XFER system (naïve decoding)
- XFER system with "strong" decoder:
  - No grammar rules (baseline)
  - Manually developed grammar rules
  - Automatically learned grammar rules
- XFER+SMT with strong decoder (MEMT)

58 Results on JHU Test Set (very miserly training data)
System                          BLEU   M-BLEU  NIST
EBMT                            0.058  0.165   4.22
SMT                             0.093  0.191   4.64
XFER (naïve), man grammar       0.055  0.177   4.46
XFER (strong), no grammar       0.109  0.224   5.29
XFER (strong), learned grammar  0.116  0.231   5.37
XFER (strong), man grammar      0.135  0.243   5.59
XFER+SMT                        0.136          5.65

59 Effect of Reordering in the Decoder

60 Observations and Lessons (I)
- XFER with the strong decoder outperformed SMT in the miserly data scenario, even without any grammar rules
  - SMT was trained on elicited phrases that are very short
  - SMT had insufficient data to train more discriminative translation probabilities
- XFER takes advantage of morphology
  - Token coverage without morphology: …
  - Token coverage with morphology: …
- The manual grammar is currently somewhat better than the automatically learned grammar
  - Learned rules did not yet use version-space learning
  - Large room for improvement in rule learning
  - Importance of effective, well-founded scoring of learned rules

61 Observations and Lessons (II)
- MEMT (XFER and SMT) based on the strong decoder produced the best results in the miserly scenario
- Reordering within the decoder provided very significant score improvements
  - Much room for more sophisticated grammar rules
  - The strong decoder can carry some of the reordering "burden"

62 Conclusions
- Transfer rules (both manual and learned) offer significant contributions that can complement existing data-driven approaches; also in medium- and large-data settings?
- Initial steps toward a statistically grounded transfer-based MT system with:
  - Rules that are scored based on a well-founded probability model
  - Strong and effective decoding that incorporates the most advanced techniques used in SMT decoding
- Working from the "opposite" end of research on incorporating models of syntax into "standard" SMT systems [Knight et al.]
- Our direction makes sense in the limited-data scenario

63 Future Directions
- Continued work on automatic rule learning (especially Seeded Version Space Learning)
- Improved leveraging of manual grammar resources, and interaction with bilingual speakers
- Developing a well-founded model for assigning scores (probabilities) to transfer rules
- Improving the strong decoder to better fit the specific characteristics of the XFER model
- MEMT with improved combination of output from different translation engines (with different scorings) and strong decoding capabilities

64 Rule Learning - Overview
Goal: acquire syntactic transfer rules, using available knowledge from the source side (grammatical structure). Three steps:
1. Flat Seed Generation: first guesses at transfer rules; no syntactic structure
2. Compositionality: use previously learned rules to add structure
3. Seeded Version Space Learning: refine rules by generalizing with validation

65 Flat Seed Generation
Create a transfer rule that is specific to the sentence pair, but abstracted to the POS level; no syntactic structure.

Element: Source
- SL POS sequence: f-structure
- TL POS sequence: TL dictionary, aligned SL words
- Type information: corpus (same on SL and TL)
- Alignments: informant
- x-side constraints: f-structure
- y-side constraints: TL dictionary, aligned SL words (list of projecting features)

66 Flat Seed Generation - Example
The highly qualified applicant did not accept the offer.
Der äußerst qualifizierte Bewerber nahm das Angebot nicht an.
((1,1),(2,2),(3,3),(4,4),(6,8),(7,5),(7,9),(8,6),(9,7))

S::S [det adv adj n aux neg v det n] -> [det adv adj n v det n neg vpart]
(;;alignments:
 (x1::y1)(x2::y2)(x3::y3)(x4::y4)(x6::y8)(x7::y5)(x7::y9)(x8::y6)(x9::y7)
 ;;constraints:
 ((x1 def) = *+)
 ((x4 agr) = *3-sing)
 ((x5 tense) = *past)
 ....
 ((y1 def) = *+)
 ((y3 case) = *nom)
 ((y4 agr) = *3-sing)
 ....
)

67 Compositionality - Overview
- Traverse the c-structure of the English sentence, adding compositional structure for translatable chunks
- Adjust constituent sequences and alignments
- Remove unnecessary constraints, i.e. those that are contained in the lower-level rule
- Adjust constraints: use the f-structure of the correct translation vs. the f-structures of incorrect translations to introduce context constraints

68 Compositionality - Example
Flat S rule:
S::S [det adv adj n aux neg v det n] -> [det adv adj n v det n neg vpart]
(;;alignments:
 (x1::y1)(x2::y2)(x3::y3)(x4::y4)(x6::y8)(x7::y5)(x7::y9)(x8::y6)(x9::y7)
 ;;constraints:
 ((x1 def) = *+)
 ((x4 agr) = *3-sing)
 ((x5 tense) = *past)
 ....
 ((y1 def) = *+)
 ((y3 case) = *nom)
 ((y4 agr) = *3-sing)
 ....
)

Previously learned NP rule:
NP::NP [det ADJP n] -> [det ADJP n]
((x1::y1) ...
 ((y3 agr) = *3-sing)
 ((x3 agr) = *3-sing)
 ....)

Generated compositional rule:
S::S [NP aux neg v det n] -> [NP v det n neg vpart]
(;;alignments:
 (x1::y1)(x3::y5)(x4::y2)(x4::y6)(x5::y3)(x6::y4)
 ;;constraints:
 ((x2 tense) = *past)
 ....
 ((y1 def) = *+)
 ((y1 case) = *nom)
 ....
)

69 Seeded Version Space Learning: Overview
Goal: further generalize the acquired rules.
Methodology:
- Preserve the general structural transfer
- Consider relaxing specific feature constraints
- Seed rules are grouped into clusters of similar transfer structure (type, constituent sequences, alignments)
- Each cluster forms a version space: a partially ordered hypothesis space with a specific and a general boundary
- The seed rules in a group form the specific boundary of the version space
- The general boundary is the (implicit) transfer rule with the same type, constituent sequences, and alignments, but no feature constraints

70 Seeded Version Space Learning
1. Group seed rules into version spaces as above.
2. Make use of the partial order of rules in the version space. The partial order is defined via the f-structures satisfying the constraints.
3. Generalize in the space by repeated merging of rules:
   - Deletion of a constraint
   - Moving value constraints to agreement constraints, e.g. ((x1 num) = *pl), ((x3 num) = *pl) -> ((x1 num) = (x3 num))
4. Check the translation power of the generalized rules against the sentence pairs.

71 Seeded Version Space Learning: Example
Two specific seed rules:
S::S [NP aux neg v det n] -> [NP v det n neg vpart]
(;;alignments: (x1::y1)(x3::y5)(x4::y2)(x4::y6)(x5::y3)(x6::y4)
 ;;constraints:
 ((x2 tense) = *past) ....
 ((y1 def) = *+)
 ((y1 case) = *nom)
 ((y1 agr) = *3-sing) ...
 ((y3 agr) = *3-sing)
 ((y4 agr) = *3-sing) ...)

S::S [NP aux neg v det n] -> [NP v det n neg vpart]
(;;alignments: (x1::y1)(x3::y5)(x4::y2)(x4::y6)(x5::y3)(x6::y4)
 ;;constraints:
 ((x2 tense) = *past) ...
 ((y1 def) = *+)
 ((y1 case) = *nom)
 ((y1 agr) = *3-plu) ...
 ((y3 agr) = *3-plu)
 ((y4 agr) = *3-plu) ...)

Merged (generalized) rule:
S::S [NP aux neg v det n] -> [NP v det n neg vpart]
(;;alignments: (x1::y1)(x3::y5)(x4::y2)(x4::y6)(x5::y3)(x6::y4)
 ;;constraints:
 ((x2 tense) = *past)
 ((y1 def) = *+)
 ((y1 case) = *nom)
 ((y4 agr) = (y3 agr)) ...)

72 Preliminary Evaluation
- English to German
- Corpus of 141 ADJPs, simple NPs, and sentences
- 10-fold cross-validation experiment
- Goals:
  - Do we learn useful transfer rules?
  - Does compositionality improve generalization?
  - Does VS-learning improve generalization?

73 Summary of Results
- Average translation accuracy on the cross-validation test set: 62%
  - Without VS-learning: 43%
  - Without compositionality: 57%
- Average number of VSs: 24
- Average number of sentences per VS: 3.8
- Average number of merges per VS: 1.6
- Percentage of compositional rules: 34%

74 Conclusions
- New paradigm for learning transfer rules from a pre-designed elicitation corpus
- Geared toward languages with very limited resources
- Preliminary experiments validate the approach: compositionality and VS-learning improve generalization

75 Future Work
- Larger, more diverse elicitation corpus
- Additional languages (Mapudungun, ...)
- Less information on the TL side
- Reverse translation direction
- Refine the various algorithms:
  - Operators for VS generalization
  - VS search
  - Layers for compositionality
  - User-interactive verification

76 Seeded Version Space Learning: Generalization
The partial order of the version space:
- Definition: a transfer rule tr1 is strictly more general than another transfer rule tr2 if all f-structures that are satisfied by tr2 are also satisfied by tr1.
Generalize rules by merging them:
- Deletion of a constraint
- Raising two value constraints to an agreement constraint, e.g.
  ((x1 num) = *pl), ((x3 num) = *pl) -> ((x1 num) = (x3 num))

77 Seeded Version Space Learning: Merging Two Rules
The merging algorithm proceeds in three steps (sketched below). To merge tr1 and tr2 into tr_merged:
1. Copy all constraints that are in both tr1 and tr2 into tr_merged.
2. Consider tr1 and tr2 separately: for the remaining constraints in tr1 and tr2, perform all possible instances of raising value constraints to agreement constraints.
3. Repeat step 1.
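A minimal sketch of the three-step merge over (slot, feature, value) constraint triples, one plausible reading of the algorithm above (step 3's "repeat step 1" is interpreted as keeping the agreement constraints both sides now share):

def merge(tr1, tr2):
    # Step 1: keep constraints common to both rules.
    shared = [c for c in tr1 if c in tr2]
    rest1 = [c for c in tr1 if c not in shared]
    rest2 = [c for c in tr2 if c not in shared]

    # Step 2: within each rule, raise remaining value constraints that
    # share a feature and value to agreement constraints.
    def raised(cs):
        return {((s1, f1), "=", (s2, f2))
                for i, (s1, f1, v1) in enumerate(cs)
                for (s2, f2, v2) in cs[i + 1:]
                if f1 == f2 and v1 == v2}

    # Step 3: keep the agreement constraints both sides now share.
    return shared + sorted(raised(rest1) & raised(rest2))

tr1 = [("x1", "def", "+"), ("x1", "num", "*sg"), ("x3", "num", "*sg")]
tr2 = [("x1", "def", "+"), ("x1", "num", "*pl"), ("x3", "num", "*pl")]
print(merge(tr1, tr2))
# -> [('x1', 'def', '+'), (('x1', 'num'), '=', ('x3', 'num'))]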

78 Seeded Version Space Learning: The Search
- The Seeded Version Space algorithm itself is the repeated generalization of rules by merging
- A merge is successful if the set of sentences that can be correctly translated with the merged rule is a superset of the union of the sets that can be translated with the unmerged rules, i.e. we check the power of the rule
- Merge until no more successful merges are possible

