
Phonology




1. Phonology
[These slides are missing most examples and discussion from class …]
600.465 - Intro to NLP - J. Eisner

2. What is Phonology?
• cat + -s → cats, pronounced “kats”
• dog + -s → dogs, pronounced “dawgz”
• rose + -s → roses, pronounced “roziz”
• kiss + -s → kisses, pronounced “kisiz”
Why? Phonology doesn’t care about the spelling (that’s just applied morphology).
How do you pronounce a sequence of morphemes? Especially, how & why do you fix up the pronunciation at the seams between morphemes?
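The regular pattern behind these endings can be written as a toy rule: “iz” after a sibilant, “z” after a voiced sound, “s” otherwise. A minimal Python sketch (not from the slides; real phonology operates on phonemes, so the stems below are ad-hoc respellings of the pronunciations):

    # Rough sketch of English plural allomorphy (illustration only).
    SIBILANTS = ("s", "z", "sh", "ch", "x")              # trigger "iz"
    VOICED = ("b", "d", "g", "v", "m", "n", "l", "r",
              "w", "y", "a", "e", "i", "o", "u")         # trigger "z"

    def plural_pronunciation(stem: str) -> str:
        """Pick the surface form of the plural morpheme -s."""
        if stem.endswith(SIBILANTS):
            return stem + "iz"   # roz + -s -> "roziz"
        if stem.endswith(VOICED):
            return stem + "z"    # dawg + -s -> "dawgz"
        return stem + "s"        # kat + -s -> "kats"

    for stem in ["kat", "dawg", "roz", "kis"]:
        print(stem, "+ -s ->", plural_pronunciation(stem))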

3. What is Phonology?
• A function twixt head and lip
• What class of functions is allowed?
• Differs from one language to the next
• Often complicated, but not arbitrary
• Comp Sci: How to compute, invert, learn?
Morphology (head) → underlying phonemes → phonological mapping → surface phones → Articulation (mouth)
Example: resign → “ree-ZIYN”, but resign + -ation → “reh-zihg-NAY-shun”

4. Successive Fixups for Phonology
• Chomsky & Halle (1968)
• Stepwise refinement of a single form: input (I) → Rule 1 → Rule 2 → Rule 3 → … → output (O)
• How to handle the “resignation” example?
• That is, O = f(I) = g3(g2(g1(I)))
• Function composition (e.g., transducer composition); a sketch follows
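A minimal sketch of this composition view in Python, with an underlying plural /z/ in the textbook style; the exact rules are simplified stand-ins, not Chomsky & Halle's actual rules:

    # Successive fixups: apply rewrite rules in a fixed order.
    def g1(form: str) -> str:
        # epenthesis: insert "i" between two adjacent sibilants
        for s1 in ("s", "z"):
            for s2 in ("s", "z"):
                form = form.replace(s1 + s2, s1 + "i" + s2)
        return form

    def g2(form: str) -> str:
        # devoicing: "z" assimilates to a preceding voiceless consonant
        for c in ("p", "t", "k", "f", "s"):
            form = form.replace(c + "z", c + "s")
        return form

    def f(underlying: str) -> str:
        """O = f(I) = g2(g1(I)): function composition of the rules."""
        return g2(g1(underlying))

    for stem in ("kat", "dawg", "roz", "kis"):
        print(stem + "+z ->", f(stem + "z"))
    # kat+z -> kats   dawg+z -> dawgz   roz+z -> roziz   kis+z -> kisiz
    # Note: reversing the order, g1(g2(I)), derives a wrong form for kis+z.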

5. How to Give Orders (example courtesy of K. Crosswhite)
• Directions version:
  • Break two eggs into a medium mixing bowl.
  • Remove this tab first.
  • On the last day of each month, come to this office and pay your rent.
• Rules version:
  • No running in the house is allowed.
  • All dogs must be on a leash.
  • Rent must be paid by the first day of each month.
• In the rules version, you describe what a good solution would look like, plus a search procedure for finding the best solution.
Where else have we seen this? Directions are successive fixup (derivation); rules are successive winnowing (optimization).

6. Optimality Theory for Phonology
• Prince & Smolensky (1993)
• Alternative to successive fixups
• Successive winnowing of candidate set: input → Gen → Constraint 1 → Constraint 2 → Constraint 3 → … → output
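A minimal sketch of successive winnowing in Python; Gen's candidate list and both constraints are invented for illustration:

    # OT-style evaluation: Gen proposes candidates, then each ranked
    # constraint keeps only the candidates with the fewest violations.

    def winnow(candidates, constraints):
        """Filter candidates constraint by constraint; ties survive."""
        for constraint in constraints:
            fewest = min(constraint(c) for c in candidates)
            candidates = [c for c in candidates if constraint(c) == fewest]
        return candidates

    # hypothetical Gen output for underlying /dawg+z/
    candidates = ["dawgs", "dawgz", "dawgiz"]

    constraints = [
        lambda c: len(c) - 5,             # penalize inserted segments
        lambda c: int(c.endswith("gs")),  # penalize voiceless "s" after "g"
    ]

    print(winnow(candidates, constraints))  # -> ['dawgz']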

7. Optimality Theory “Tableau”
[The tableau image is not in the transcript; only its callouts survive.]
• ** = candidate violates the constraint twice (weight 2)
• The constraint would prefer A, but is only allowed to break the tie among B, D, E
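Since the tableau image itself did not survive, here is a hypothetical violation table that recreates the callout's situation, evaluated the same way as the sketch on slide 6:

    # Invented tableau: constraint 2 would prefer A (zero violations),
    # but A is eliminated by constraint 1, so constraint 2 can only
    # break the tie among the survivors B, D, E.

    violations = {       # candidate: (constraint 1, constraint 2)
        "A": (2, 0),     # "**" = two violation marks on constraint 1
        "B": (1, 1),
        "C": (2, 1),
        "D": (1, 2),
        "E": (1, 1),
    }

    survivors = set(violations)
    for i in range(2):   # constraints applied in ranked order
        fewest = min(violations[c][i] for c in survivors)
        survivors = {c for c in survivors if violations[c][i] == fewest}
        print(f"after constraint {i + 1}: {sorted(survivors)}")
    # after constraint 1: ['B', 'D', 'E']
    # after constraint 2: ['B', 'E']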

8. Optimality Theory for Phonology
input (I) → Gen → Constraint 1 → best paths (several may tie) → Constraint 2 → best paths (breaks some ties) → Constraint 3 → … → output (O)
Each constraint adds weights to the candidates; after each constraint, only the best paths survive.

9. When do we prune back to best paths?
• Optimality Theory: at each intermediate stage
• Noisy channel: after adding up all weights
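An invented numeric contrast between the two regimes: with per-stage pruning, a candidate that is merely second-best early on is gone forever, even if its summed weight would have won:

    # Lower weight is better. Two stages, three candidates (all invented).
    weights = {          # candidate: (stage 1 weight, stage 2 weight)
        "x": (0, 9),
        "y": (1, 0),
        "z": (2, 0),
    }

    # OT-style: prune to the best paths after each stage.
    survivors = set(weights)
    for stage in range(2):
        best = min(weights[c][stage] for c in survivors)
        survivors = {c for c in survivors if weights[c][stage] == best}
    print("prune at each stage:", survivors)   # {'x'} (total weight 9!)

    # Noisy-channel-style: sum all weights, then pick the best total.
    totals = {c: sum(w) for c, w in weights.items()}
    print("prune at the end:", min(totals, key=totals.get))  # 'y' (total 1)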

10. Why does order matter?
• Optimality Theory: each machine (FSA) can choose only among outputs that previous machines liked best
• Noisy channel: each machine (FST) alters the output produced by previous machines
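One concrete consequence for OT, sketched with invented candidates and constraints: because each constraint filters only what earlier constraints let through, re-ranking the constraints can change the winner (the winnow helper repeats slide 6's sketch so this block stands alone):

    def winnow(candidates, constraints):
        for constraint in constraints:
            fewest = min(constraint(c) for c in candidates)
            candidates = [c for c in candidates if constraint(c) == fewest]
        return candidates

    candidates = ["ab", "ba"]
    no_final_a = lambda c: int(c.endswith("a"))      # hypothetical
    no_initial_a = lambda c: int(c.startswith("a"))  # hypothetical

    print(winnow(candidates, [no_final_a, no_initial_a]))  # -> ['ab']
    print(winnow(candidates, [no_initial_a, no_final_a]))  # -> ['ba']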

11. Final Remark on OT
input (I) → Gen → Constraint 1 → Constraint 2 → Constraint 3 → … → output (O)
• Repeated best-paths only works for a single input
• Better to build a full FST for I → O (invertible)
• Can do this e.g. if every constraint is binary: it assigns each candidate either 1 star (“bad”) or 0 stars (“good”); see the set-level sketch below
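The binary-constraint trick can be previewed at the level of candidate sets (a sketch only; the real construction composes transducers “leniently” so that a constraint never empties the candidate set):

    # A binary constraint splits candidates into "good" (0 stars) and
    # "bad" (1 star), so filtering reduces to one rule:
    # keep the good candidates if any exist, otherwise keep them all.

    def apply_binary_constraint(candidates: set, is_good) -> set:
        good = {c for c in candidates if is_good(c)}
        return good if good else candidates   # never returns an empty set

    candidates = {"dawgs", "dawgz"}
    print(apply_binary_constraint(candidates, lambda c: not c.endswith("gs")))
    # -> {'dawgz'}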

12. Optimality Theory “Tableau”
[The tableau image is not in the transcript; only its callout survives.]
• All surviving candidates violate constraint 3, so we can’t eliminate any
