600.465 - Intro to NLP - J. Eisner: Phonology [These slides are missing most examples and discussion from class …]


Slide 1: Phonology [These slides are missing most examples and discussion from class …]

Slide 2: What is Phonology?
Morphemes: cat + -s, dog + -s, rose + -s, kiss + -s
Spelling: cats, dogs, roses, kisses
Pronunciation: “kats”, “dawgz”, “roziz”, “kisiz”
How do you pronounce a sequence of morphemes? Especially, how and why do you fix up the pronunciation at the seams between morphemes? Phonology doesn’t care about the spelling (that’s just applied morphology).
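Not from the slides: a minimal Python sketch of the cats/dogs/roses seam fixup, stated over sounds rather than letters. The rough phoneme lists and the `plural` helper are my own illustrative assumptions, not a real transcription system.

```python
# Toy illustration of the English plural at the morpheme seam: choose
# "s", "z", or "iz" based on the stem's final (pseudo-)phoneme.
SIBILANTS = {"s", "z", "sh", "zh", "ch", "j"}   # these trigger the "iz" form
VOICELESS = {"p", "t", "k", "f", "th"}          # these trigger the "s" form

def plural(stem):
    """stem: list of rough phoneme symbols, e.g. ["k", "a", "t"]."""
    last = stem[-1]
    if last in SIBILANTS:
        return stem + ["i", "z"]    # roses, kisses
    if last in VOICELESS:
        return stem + ["s"]         # cats
    return stem + ["z"]             # dogs (voiced consonants and vowels)

print(plural(["k", "a", "t"]))        # ['k', 'a', 't', 's']
print(plural(["d", "a", "w", "g"]))   # ['d', 'a', 'w', 'g', 'z']
print(plural(["r", "o", "z"]))        # ['r', 'o', 'z', 'i', 'z']
```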

Slide 3: What is Phonology?
Morphemes: nap + -t, nab + -t, nod + -t, knot + -t
Spelling: napped, nabbed, nodded, knotted
Pronunciation: “næpt”, “næbd”, “nadid”, “natid”
Actually, nodded and knotted are pronounced identically, “na∂ɪd”, thanks to the English “flapping” rule (similarly: ladder/latter, bedding/betting).
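The flapping merger can be mimicked with a context-dependent rewrite. Below is a toy regex sketch of my own, using the slide's ∂ for the flap and ignoring the stress conditions that the real rule is sensitive to.

```python
import re

def flap(pron):
    """Toy flapping rule: t or d between vowels is rewritten as the flap ∂.
    The real English rule also cares about stress; that is ignored here."""
    return re.sub(r"(?<=[aeiouæ])[td](?=[aeiouæ])", "∂", pron)

print(flap("nadid"), flap("natid"))   # na∂id na∂id  -> identical on the surface
print(flap("næpt"), flap("næbd"))     # næpt næbd    -> unchanged (no vowel after t/d)
```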

Slide 4: What is Phonology? “Trisyllabic Shortening” in English
divine → divinity, futile → futility, senile → senility, satire → satirical, decide → decision, wild → wilderness
serene → serenity, supreme → supremity, obscene → obscenity, obese → *obesity
(and similarly for other vowels)

Slide 5: What is Phonology?
- A function twixt head and lip
- What class of functions is allowed?
- Differs from one language to the next
- Often complicated, but not arbitrary
- Comp Sci: how to compute, invert, learn?
Pipeline: Morphology (head) → underlying phonemes → Phonological mapping → surface phones → Articulation (mouth)
Example: resign → “ree-ZIYN”; resign + -ation → “reh-zihg-NAY-shun”

Slide 6: Successive Fixups for Phonology
- Chomsky & Halle (1968)
- Stepwise refinement of a single form
- How to handle the “resignation” example?
- That is, O = f(I) = g₃(g₂(g₁(I)))
- Function composition (e.g., transducer composition)
Diagram: input (I) → Rule 1 → Rule 2 → Rule 3 → output (O)
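The “successive fixups” architecture is literally function composition: each rule maps one whole form to the next. A minimal sketch of that idea, where g1, g2, g3 are invented toy string rewrites standing in for real rules (not the actual Chomsky & Halle rules):

```python
from functools import reduce

def compose(*rules):
    """Build f with f(I) = g3(g2(g1(I))): apply each rule to the whole form, in order."""
    return lambda form: reduce(lambda acc, rule: rule(acc), rules, form)

# Purely illustrative stand-ins for real phonological rules:
g1 = lambda s: s.replace("b+t", "b+d")   # voice the suffix after a voiced stop
g2 = lambda s: s.replace("+", "")        # erase the morpheme boundary
g3 = lambda s: s.replace("dd", "d")      # degeminate (no effect on this input)

f = compose(g1, g2, g3)
print(f("næb+t"))   # næb+t -> næb+d -> næbd  (cf. "nabbed" on slide 3)
```

In the finite-state setting, each gᵢ would be a transducer and `compose` would be transducer composition, but the sequencing logic is the same.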

Slide 7: How to Give Orders (example courtesy of K. Crosswhite)
Directions version:
- Break two eggs into a medium mixing bowl.
- Remove this tab first.
- On the last day of each month, come to this office and pay your rent.
Rules version:
- No running in the house is allowed.
- All dogs must be on a leash.
- Rent must be paid by the first day of each month.
In the rules version, you describe what a good solution would look like, plus a search procedure for finding the best solution. Where else have we seen this? Directions correspond to successive fixup (derivation); rules correspond to successive winnowing (optimization).

Slide 8: Optimality Theory for Phonology
- Prince & Smolensky (1993)
- Alternative to successive fixups
- Successive winnowing of the candidate set
Diagram: input → Gen → Constraint 1 → Constraint 2 → Constraint 3 → ... → output
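The winnowing architecture can be sketched directly: Gen proposes candidates, and each ranked constraint in turn keeps only the candidates with the fewest violations. A minimal sketch of mine; the Gen function, candidate strings, and constraints are all made up for illustration.

```python
def winnow(candidates, constraint):
    """Keep only the candidates with the fewest violations ("stars") of this constraint."""
    fewest = min(constraint(c) for c in candidates)
    return [c for c in candidates if constraint(c) == fewest]

def run_ot(underlying, gen, constraints):
    candidates = gen(underlying)
    for constraint in constraints:      # ranked: earlier constraints matter more
        candidates = winnow(candidates, constraint)
    return candidates                   # the survivors (ties are possible)

# Invented example: Gen offers a few repairs of a final consonant cluster.
gen = lambda u: [u, u[:-1], u + "i"]                       # faithful, delete, epenthesize
no_final_cluster = lambda c: 1 if c.endswith("kt") else 0  # markedness constraint (invented)
dont_insert = lambda c: max(0, len(c) - 3)                 # DEP-style, for this 3-letter input

print(run_ot("akt", gen, [no_final_cluster, dont_insert]))   # ['ak']: deletion wins here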

Slide 9: Optimality Theory “Tableau”
The constraint would prefer A, but it is only allowed to break the tie among B, D, E.
** = candidate violates the constraint twice (weight 2)

Slide 10: Optimality Theory for Phonology
Diagram: input (I) → Gen → Constraint 1 (adds weights to candidates) → best paths (several may tie) → Constraint 2 (adds weights to candidates) → best paths (breaks some ties) → Constraint 3 → ... → output (O)

Slide 11: When do we prune back to best paths?
- Optimality Theory: at each intermediate stage
- Noisy channel: after adding up all weights
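The difference is easy to see on toy numbers (invented, not from the slides): pruning to the best candidates after each stage can discard a candidate that would have had the lowest total weight.

```python
# Invented weights: each candidate's cost from each of two successive "machines".
costs = {
    "A": (0, 5),
    "B": (1, 0),
}

# OT-style: prune to the best candidate(s) after each stage.
stage1_best = min(costs, key=lambda c: costs[c][0])    # stage 1 keeps only "A"
ot_winner = stage1_best                                # "B" never reaches stage 2

# Noisy-channel-style: add up all the weights, then take the overall best.
nc_winner = min(costs, key=lambda c: sum(costs[c]))    # "B" has total 1, "A" has total 5

print(ot_winner, nc_winner)   # A B  -> pruning early and pruning late can disagree
```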

Slide 12: Why does order matter?
- Optimality Theory: each machine (FSA) can choose only among the outputs that previous machines liked best
- Noisy channel: each machine (FST) alters the output produced by previous machines

Slide 13: Final Remark on OT
- Repeated best-paths only works for a single input
- Better to build a full FST for I → O (invertible)
- Can do this, e.g., if every constraint is binary: it assigns each candidate either 1 star (“bad”) or 0 stars (“good”)
Diagram: input (I) → Gen → Constraint 1 → Constraint 2 → Constraint 3 → ... → output (O)
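With binary constraints, winnowing takes an especially simple form, sometimes described as "lenient" filtering: keep the 0-star candidates if any exist, otherwise keep everything. A small sketch of that special case (mine, operating on explicit candidate lists rather than on the FSTs the slide has in mind; the constraint is invented):

```python
def lenient_filter(candidates, is_good):
    """Binary constraint: is_good(c) is True (0 stars) or False (1 star).
    Keep the good candidates if there are any; if every candidate violates
    the constraint, it cannot eliminate anything and all survive."""
    good = [c for c in candidates if is_good(c)]
    return good if good else list(candidates)

no_pt_cluster = lambda c: "pt" not in c                            # invented binary constraint
print(lenient_filter(["tapta", "tapat", "tatap"], no_pt_cluster))  # ['tapat', 'tatap']
print(lenient_filter(["apta", "pta"], no_pt_cluster))              # ['apta', 'pta'] (all violate)
```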

Slide 14: Optimality Theory “Tableau”
All surviving candidates violate constraint 3, so we can’t eliminate any.

