1 Minimally Supervised Morphological Analysis by Multimodal Alignment
David Yarowsky and Richard Wicentowski

2 Introduction
The algorithm is capable of inducing inflectional morphological analyses of both regular and highly irregular forms. It combines four original alignment models based on:
- Relative corpus frequency.
- Contextual similarity.
- Weighted string similarity.
- Incrementally retrained inflectional transduction probabilities.

3 Lecture's Subjects
- Task definition.
- Required and optional resources.
- The algorithm.
- Empirical evaluation.

4 Task Definition
Consider this task as three steps:
1. Estimate a probabilistic alignment between inflected forms and root forms.
2. Train a supervised morphological analysis learner on a weighted subset of these aligned pairs.
3. Use the result from step 2 to iteratively refine the alignment in step 1.
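
A minimal Python sketch of this three-step loop. The helper names (initial_alignment, train_supervised_analyzer, rescore_alignment) and the 0.8 confidence threshold are hypothetical placeholders, not part of the paper:

```python
# Sketch of the three-step induction loop; the helper functions are hypothetical.

def induce_morphology(inflections, roots, num_iterations=5):
    # Step 1: estimate a probabilistic alignment between inflected forms
    # and candidate roots (frequency, context, and string-similarity models).
    alignment = initial_alignment(inflections, roots)

    analyzer = None
    for _ in range(num_iterations):
        # Step 2: train a supervised analyzer on a confidence-weighted
        # subset of the currently aligned (inflection, root) pairs.
        confident_pairs = [(infl, root, w) for infl, root, w in alignment if w > 0.8]
        analyzer = train_supervised_analyzer(confident_pairs)

        # Step 3: use the analyzer's transduction probabilities to rescore
        # and refine the alignment, then repeat from step 2.
        alignment = rescore_alignment(alignment, analyzer)

    return alignment, analyzer
```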

5 Example (POS) Definitions:

6 Task Definition cont.
The target output of step 1:

7 Required and Optional Resources
For the given language we need:
- A table of the inflectional parts of speech (POS).
- A list of the canonical suffixes.
- A large text corpus.

8 Required and Optional Resources cont.
- A list of candidate noun, verb, and adjective roots (from a dictionary), and any rough mechanism for identifying the candidate POS of the remaining vocabulary (not based on morphological analysis).
- A list of the consonants and vowels.

9 Required and Optional Resources cont.
The following are not essential, but are used if available:
- A list of common function words.
- Distance/similarity tables generated on previously studied languages.

10 The Algorithm
Combines four original alignment models:
- Alignment by Frequency Similarity.
- Alignment by Context Similarity.
- Alignment by Weighted Levenshtein Distance.
- Alignment by Morphological Transformation Probabilities.

11 Lemma Alignment by Frequency Similarity
The motivating dilemma:
sing → singed (VBD)?
sing → sang (VBD)?
take → taked (VBD)?

12 Lemma Alignment by Frequency Similarity cont.
This table is based on relative corpus frequency:

13 Lemma Alignment by Frequency Similarity cont.

14 Lemma Alignment by Frequency Similarity cont.
A problem: the true alignments between inflections and roots are unknown in advance.
A simplifying assumption: the frequency ratios between inflections and roots are not significantly different between regular and irregular morphological processes.

15 Lemma Alignment by Frequency Similarity cont.
Similarity between regular and irregular forms:

16 Lemma Alignment by Frequency Similarity cont.
The expected frequency should also be estimable from the frequency of any of the other inflectional variants; for example, the VBD/VBG and VBD/VBZ ratios could also be used as estimators.
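
A Python sketch of the frequency-similarity idea under the simplifying assumption above: estimate the expected inflection/root frequency ratio from known regular pairs, then score a candidate pair by how closely its observed ratio matches. The regular_pairs list and corpus_freq dictionary are hypothetical inputs, not the paper's exact formulation:

```python
import math
from statistics import median

def expected_log_ratio(regular_pairs, corpus_freq):
    """Median log(freq(inflection) / freq(root)) over known regular pairs."""
    ratios = [math.log(corpus_freq[infl] / corpus_freq[root])
              for root, infl in regular_pairs
              if corpus_freq.get(root) and corpus_freq.get(infl)]
    return median(ratios)

def frequency_similarity(root, inflection, corpus_freq, exp_log_ratio):
    """Higher score = the pair's frequency ratio looks more like a regular pair's."""
    observed = math.log(corpus_freq[inflection] / corpus_freq[root])
    return -abs(observed - exp_log_ratio)
```

The same comparison can be made against VBD/VBG or VBD/VBZ ratios in place of VBD/VB, as the slide suggests.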

17 Lemma Alignment by Frequency Similarity cont.

18 Lemma Alignment by Context Similarity
- Based on the contextual similarity of the candidate forms.
- Computes similarity between vectors of weighted and filtered context features.
- Clusters inflectional variants of verbs (e.g. sipped, sipping, and sip).
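
A Python sketch of the context comparison: build a filtered bag-of-context-words vector for each candidate form and compare vectors with cosine similarity. The stopword filtering and raw-count weighting here are illustrative choices, not the paper's exact feature scheme:

```python
import math
from collections import Counter

def context_vector(contexts, function_words):
    """contexts: lists of words observed near the keyword (window already applied)."""
    vec = Counter()
    for window in contexts:
        for w in window:
            if w.lower() not in function_words:   # filter out function words
                vec[w.lower()] += 1
    return vec

def cosine_similarity(u, v):
    """Cosine between two sparse count vectors (Counters)."""
    dot = sum(u[w] * v[w] for w in set(u) & set(v))
    norm_u = math.sqrt(sum(c * c for c in u.values()))
    norm_v = math.sqrt(sum(c * c for c in v.values()))
    return dot / (norm_u * norm_v) if norm_u and norm_v else 0.0
```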

19 Lemma Alignment by Context Similarity cont.
Example pattern: CW_subj (AUX|NEG)* V_keyword DET? CW* CW_obj
Example sentence: "Shlomo is eating the apple", where Shlomo fills the CW_subj slot, is fills AUX, eating is the keyword verb, the fills DET, and apple fills CW_obj.

20 Lemma Alignment by Weighted Levenshtein Distance
- Considers the overall stem edit distance.
- Uses a cost matrix with distance costs initially set to (0.5, 0.6, 1.0, 0.98).
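
A Python sketch of a weighted Levenshtein distance. The transcript does not say which edit classes the four costs (0.5, 0.6, 1.0, 0.98) attach to, so the assignment below (vowel-vowel substitution cheapest, consonant substitution at full cost, insertion/deletion at 0.98) is an assumption made for illustration:

```python
VOWELS = set("aeiou")

def sub_cost(a, b):
    if a == b:
        return 0.0
    if a in VOWELS and b in VOWELS:
        return 0.5   # vowel -> vowel (assumed cheapest)
    if (a in VOWELS) != (b in VOWELS):
        return 0.6   # vowel <-> consonant (assumed)
    return 1.0       # consonant -> consonant (assumed)

def weighted_levenshtein(s, t, indel=0.98):
    """Standard dynamic program over prefixes, with class-sensitive substitution costs."""
    m, n = len(s), len(t)
    d = [[0.0] * (n + 1) for _ in range(m + 1)]
    for i in range(1, m + 1):
        d[i][0] = i * indel
    for j in range(1, n + 1):
        d[0][j] = j * indel
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            d[i][j] = min(d[i - 1][j] + indel,                             # deletion
                          d[i][j - 1] + indel,                             # insertion
                          d[i - 1][j - 1] + sub_cost(s[i - 1], t[j - 1]))  # substitution
    return d[m][n]

# weighted_levenshtein("sang", "sing") == 0.5: a single vowel-vowel substitution.
```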

21 Lemma Alignment by Morphological Transformation Probabilities
The goal is to generalize a mapping function via a generative probabilistic model.

22 Lemma Alignment by Morphological Transformation Probabilities
Result table:

23 Lemma Alignment by Morphological Transformation Probabilities cont.
root + stem change + suffix → inflection (the combination is unique), so:
P(inflection | root, suffix, POS) = P(stemchange | root, suffix, POS)
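
A tiny Python illustration of why the identity holds: a root, a stem change, and a suffix determine the surface inflection deterministically, so modelling the inflection reduces to modelling the stem change. Representing a stem change as an (old_ending, new_ending) rewrite is an assumption for illustration:

```python
def apply_stem_change(root, change, suffix):
    """Rewrite the end of the root according to `change`, then attach the suffix."""
    old, new = change
    stem = root[:len(root) - len(old)] + new if root.endswith(old) else root
    return stem + suffix

# apply_stem_change("solidify", ("y", "i"), "ed")  -> "solidified"
# apply_stem_change("take", ("ake", "ook"), "")    -> "took"
```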

24 Lemma Alignment by Morphological Transformation Probabilities cont.
Example:

25 Lemma Alignment by Morphological Transformation Probabilities cont.
Example:
P(solidified | solidify, +ed, VBD) = P(y→i | solidify, +ed, VBD)
≈ λ1·P(y→i | ify, +ed) + (1-λ1)·(λ2·P(y→i | fy, +ed) + (1-λ2)·(λ3·P(y→i | y, +ed) + (1-λ3)·(λ4·P(y→i | +ed) + (1-λ4)·P(y→i))))
Note that the POS can be deleted from the conditioning context in the backed-off terms.
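
A Python sketch of this backoff interpolation: the stem-change probability is conditioned on progressively shorter root endings, backing off to the suffix alone and finally to the unconditioned change probability. The probability tables and lambda values here are hypothetical; in the paper they come from the incrementally retrained analyzer:

```python
def backoff_probability(change, root_endings, suffix, tables, lambdas, prior):
    """
    change       : a stem change, e.g. ("y", "i")
    root_endings : most specific first, e.g. ["ify", "fy", "y"]
    tables       : maps (ending, suffix) or (None, suffix) to {change: probability}
    lambdas      : one interpolation weight per backoff level (lambda1..lambda4)
    prior        : unconditioned P(change)
    """
    contexts = [(e, suffix) for e in root_endings] + [(None, suffix)]
    prob = prior
    # Work from the least specific context outward, mirroring the nesting above.
    for context, lam in zip(reversed(contexts), reversed(lambdas)):
        prob = lam * tables.get(context, {}).get(change, 0.0) + (1 - lam) * prob
    return prob

# e.g. backoff_probability(("y", "i"), ["ify", "fy", "y"], "ed",
#                          tables, [0.7, 0.6, 0.5, 0.4], prior=0.05)
```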

26 Lemma Alignment by Model Combination and the Pigeonhole Principle
- No single model is sufficiently effective on its own.
- The frequency, Levenshtein, and context similarity models retain equal relative weight.
- The morphological transformation similarity model increases in relative weight as retraining proceeds.
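
A Python sketch of one way to realize that weighting, assuming the four model scores are already normalized to a comparable scale; the schedule by which the transformation model's weight grows is an assumption, not the paper's formula:

```python
def combined_score(freq_s, ctx_s, lev_s, morph_s, iteration, max_iter=5):
    """Blend four per-pair scores; the transformation model gains weight over iterations."""
    morph_weight = iteration / max_iter           # grows from 0 toward 1
    base_weight = (1.0 - morph_weight) / 3.0      # equal share for the other three
    return base_weight * (freq_s + ctx_s + lev_s) + morph_weight * morph_s
```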

27 Lemma Alignment by Model Combination and the Pigeonhole Principle
Example:

28 Lemma Alignment by Model Combination and the Pigeonhole Principle cont.
The final alignment is based on the pigeonhole principle: for a given POS, a root shouldn't have more than one inflection, nor should multiple inflections in the same POS share the same root.
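
A Python sketch of enforcing this constraint greedily within one POS: accept candidate pairs in order of decreasing combined score, never reusing an inflection or a root. The greedy selection is an illustrative choice; the paper's exact procedure may differ:

```python
def pigeonhole_assign(candidates):
    """candidates: list of (score, inflection, root) tuples for a single POS."""
    used_inflections, used_roots = set(), set()
    alignment = {}
    for score, inflection, root in sorted(candidates, reverse=True):
        if inflection in used_inflections or root in used_roots:
            continue                  # that pigeonhole is already filled
        alignment[inflection] = root
        used_inflections.add(inflection)
        used_roots.add(root)
    return alignment
```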

29 Empirical Evaluation
Performance:

