
CS60057 Speech & Natural Language Processing





Presentation Transcript


  1. CS60057 Speech & Natural Language Processing Autumn 2007 Lecture 12 22 August 2007 Natural Language Processing

  2. LING 180 / SYMBSYS 138 Intro to Computer Speech and Language Processing Lecture 13: Grammar and Parsing (I) October 24, 2006 Dan Jurafsky Thanks to Jim Martin for many of these slides!

  3. Outline for Grammar/Parsing Week • Context-Free Grammars and Constituency • Some common CFG phenomena for English • Sentence-level constructions • NP, PP, VP • Coordination • Subcategorization • Top-down and Bottom-up Parsing • Earley Parsing • Quick sketch of probabilistic parsing

  4. Review • Parts of Speech • Basic syntactic/morphological categories that words belong to • Part of Speech tagging • Assigning parts of speech to all the words in a sentence

  5. Syntax • Syntax: from Greek syntaxis, “setting out together, arrangement” • Refers to the way words are arranged together, and the relationships between them. • Distinction: • Prescriptive grammar: how people ought to talk • Descriptive grammar: how they do talk • The goal of syntax is to model the knowledge that people unconsciously have about the grammar of their native language

  6. Syntax • Why should you care? • Grammar checkers • Question answering • Information extraction • Machine translation

  7. 4 key ideas of syntax • Constituency (we’ll spend most of our time on this) • Grammatical relations • Subcategorization • Lexical dependencies • Plus one part we won’t have time for: Movement/long-distance dependency

  8. Context-Free Grammars • Capture constituency and ordering • Ordering: What are the rules that govern the ordering of words and bigger units in the language? • Constituency: How words group into units and how the various kinds of units behave

  9. Constituency • Noun phrases (NPs) • Three parties from Brooklyn • A high-class spot such as Mindy’s • The Broadway coppers • They • Harry the Horse • The reason he comes into the Hot Box • How do we know these form a constituent? • They can all appear before a verb: • Three parties from Brooklyn arrive… • A high-class spot such as Mindy’s attracts… • The Broadway coppers love… • They sit…

  10. Constituency (II) • They can all appear before a verb: • Three parties from Brooklyn arrive… • A high-class spot such as Mindy’s attracts… • The Broadway coppers love… • They sit… • But individual words can’t always appear before verbs: • *from arrive… • *as attracts… • *the is • *spot is… • Must be able to state generalizations like: • Noun phrases occur before verbs

  11. Constituency (III) • Preposing and postposing: • On September 17th, I’d like to fly from Atlanta to Denver • I’d like to fly on September 17th from Atlanta to Denver • I’d like to fly from Atlanta to Denver on September 17th. • But not: • *On September, I’d like to fly 17th from Atlanta to Denver • *On I’d like to fly September 17th from Atlanta to Denver

  12. CFG Examples • S -> NP VP • NP -> Det NOMINAL • NOMINAL -> Noun • VP -> Verb • Det -> a • Noun -> flight • Verb -> left
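As a quick illustration, this toy grammar can be encoded as a plain Python dict and expanded from S; the encoding and the `generate` helper are our own sketch, not part of the slides:

```python
# Slide 12's toy grammar: non-terminals map to lists of right-hand sides.
GRAMMAR = {
    "S":       [["NP", "VP"]],
    "NP":      [["Det", "NOMINAL"]],
    "NOMINAL": [["Noun"]],
    "VP":      [["Verb"]],
    "Det":     [["a"]],
    "Noun":    [["flight"]],
    "Verb":    [["left"]],
}

def generate(symbol):
    """Expand a symbol left to right; terminals are returned as words."""
    if symbol not in GRAMMAR:       # terminal: yield the word itself
        return [symbol]
    rhs = GRAMMAR[symbol][0]        # this toy grammar has one rule per symbol
    words = []
    for sym in rhs:
        words.extend(generate(sym))
    return words

print(" ".join(generate("S")))      # a flight left
```

The single sentence it generates, "a flight left", is exactly what these seven rules license.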

  13. CFGs • S -> NP VP • This says that there are units called S, NP, and VP in this language • That an S consists of an NP followed immediately by a VP • Doesn’t say that that’s the only kind of S • Nor does it say that this is the only place that NPs and VPs occur

  14. Generativity • As with FSAs and FSTs you can view these rules as either analysis or synthesis machines • Generate strings in the language • Reject strings not in the language • Impose structures (trees) on strings in the language

  15. Derivations • A derivation is a sequence of rules applied to a string that accounts for that string • Covers all the elements in the string • Covers only the elements in the string
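A leftmost derivation makes the two "covers" conditions concrete: every word in the string is produced by some rule, and nothing else is. A minimal sketch using the slide-12 grammar (the code itself is our own illustration):

```python
# One RHS per non-terminal, as in the slide-12 toy grammar.
RULES = {
    "S": ["NP", "VP"], "NP": ["Det", "NOMINAL"], "NOMINAL": ["Noun"],
    "VP": ["Verb"], "Det": ["a"], "Noun": ["flight"], "Verb": ["left"],
}

def leftmost_derivation(start="S"):
    """Repeatedly rewrite the leftmost non-terminal, recording each sentential form."""
    form = [start]
    steps = [" ".join(form)]
    while any(sym in RULES for sym in form):
        i = next(j for j, sym in enumerate(form) if sym in RULES)
        form = form[:i] + RULES[form[i]] + form[i + 1:]
        steps.append(" ".join(form))
    return steps

for step in leftmost_derivation():
    print(step)
```

The printed sequence runs S → NP VP → Det NOMINAL VP → … → a flight left, one rule application per line.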

  16. Derivations as Trees

  17. CFGs more formally • A context-free grammar has 4 parameters (“is a 4-tuple”) • A set of non-terminal symbols (“variables”) N • A set of terminal symbols Σ (disjoint from N) • A set of productions P, each of the form A -> α • Where A is a non-terminal and α is a string of symbols from the infinite set of strings (Σ ∪ N)* • A designated start symbol S

  18. Defining a CF language via derivation • A string A derives a string B if • A can be rewritten as B via some series of rule applications • More formally: • If A -> β is a production of P • α and γ are any strings in the set (Σ ∪ N)* • Then we say that • αAγ directly derives αβγ • Or αAγ ⇒ αβγ • Derivation is a generalization of direct derivation • Let α1, α2, … αm be strings in (Σ ∪ N)*, m >= 1, s.t. • α1 ⇒ α2, α2 ⇒ α3, … αm-1 ⇒ αm • We say that α1 derives αm, or α1 ⇒* αm • We then formally define the language LG generated by grammar G • As the set of strings composed of terminal symbols derived from S • LG = {w | w is in Σ* and S ⇒* w}
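The definition LG = {w | w is in Σ* and S ⇒* w} can be made concrete by searching over sentential forms breadth-first. The grammar below is our own small example (not from the slides), whose language is the classic a^n b^n:

```python
from collections import deque

# Productions for an assumed example grammar: L(G) = { a^n b^n : n >= 1 }.
P = {"S": [["a", "S", "b"], ["a", "b"]]}

def language(start="S", limit=3):
    """Return the first `limit` terminal strings derivable from the start symbol."""
    found, queue = [], deque([[start]])
    while queue and len(found) < limit:
        form = queue.popleft()
        i = next((j for j, s in enumerate(form) if s in P), None)
        if i is None:                      # no non-terminal left: a member of L(G)
            found.append(" ".join(form))
            continue
        for rhs in P[form[i]]:             # rewrite the leftmost non-terminal
            queue.append(form[:i] + rhs + form[i + 1:])
    return found

print(language())   # ['a b', 'a a b b', 'a a a b b b']
```

Breadth-first order matters here: because S can rewrite to itself, a depth-first search could chase ever-longer forms and never emit a terminal string.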

  19. Parsing • Parsing is the process of taking a string and a grammar and returning a (many?) parse tree(s) for that string

  20. Context? • The notion of context in CFGs has nothing to do with the ordinary meaning of the word context in language. • All it really means is that the non-terminal on the left-hand side of a rule is out there all by itself (free of context) • A -> B C means that I can rewrite an A as a B followed by a C regardless of the context in which A is found

  21. Key Constituents (English) • Sentences • Noun phrases • Verb phrases • Prepositional phrases

  22. Sentence-Types • Declaratives: A plane left S -> NP VP • Imperatives: Leave! S -> VP • Yes-No Questions: Did the plane leave? S -> Aux NP VP • WH Questions: When did the plane leave? S -> WH Aux NP VP

  23. NPs • NP -> Pronoun • I came, you saw it, they conquered • NP -> Proper-Noun • Los Angeles is west of Texas • John Hennessy is the president of Stanford • NP -> Det Noun • The president • NP -> Nominal • Nominal -> Noun Noun • A morning flight to Denver

  24. PPs • PP -> Preposition NP • From LA • To Boston • On Tuesday • With lunch

  25. Recursion • We’ll have to deal with rules such as the following where the non-terminal on the left also appears somewhere on the right (directly). NP -> NP PP [[The flight] [to Boston]] VP -> VP PP [[departed Miami] [at noon]]

  26. Recursion • Of course, this is what makes syntax interesting flights from Denver Flights from Denver to Miami Flights from Denver to Miami in February Flights from Denver to Miami in February on a Friday Flights from Denver to Miami in February on a Friday under $300 Flights from Denver to Miami in February on a Friday under $300 with lunch

  27. Recursion • Of course, this is what makes syntax interesting [[flights] [from Denver]] [[[Flights] [from Denver]] [to Miami]] [[[[Flights] [from Denver]] [to Miami]] [in February]] [[[[[Flights] [from Denver]] [to Miami]] [in February]] [on a Friday]] Etc.
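The nested bracketings above can be reproduced mechanically: each application of NP -> NP PP wraps the NP built so far in one more layer. A minimal sketch (the `attach_pps` helper is our own illustration):

```python
# NP -> NP PP applied repeatedly: each PP wraps the NP built so far.
def attach_pps(base, pps):
    """Fold a list of PP strings onto a base noun, one bracket layer per PP."""
    np = f"[{base}]"
    for pp in pps:
        np = f"[{np} [{pp}]]"
    return np

print(attach_pps("flights", ["from Denver", "to Miami", "in February"]))
# [[[[flights] [from Denver]] [to Miami]] [in February]]
```

Because the rule is recursive, there is no bound on how many PPs can be stacked, which is exactly the point of slide 26.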

  28. Implications of recursion and context-freeness • If you have a rule like • VP -> V NP • It only cares that the thing after the verb is an NP. It doesn’t have to know about the internal affairs of that NP

  29. The Point • VP -> V NP • I hate flights from Denver Flights from Denver to Miami Flights from Denver to Miami in February Flights from Denver to Miami in February on a Friday Flights from Denver to Miami in February on a Friday under $300 Flights from Denver to Miami in February on a Friday under $300 with lunch

  30. Bracketed Notation • [S [NP [PRO I]] [VP [V prefer] [NP [Det a] [Nom [N morning] [N flight]]]]]

  31. Coordination Constructions • S -> S and S • John went to NY and Mary followed him • NP -> NP and NP • VP -> VP and VP • … • In fact the right rule for English is X -> X and X

  32. Problems • Agreement • Subcategorization • Movement (for want of a better term)

  33. Agreement • This dog / Those dogs • This dog eats / Those dogs eat • *This dogs / *Those dog • *This dog eat / *Those dogs eats

  34. Possible CFG Solution • S -> NP VP • NP -> Det Nominal • VP -> V NP • … • SgS -> SgNP SgVP • PlS -> PlNP PlVP • SgNP -> SgDet SgNom • PlNP -> PlDet PlNom • PlVP -> PlV NP • SgVP -> SgV NP • …
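The number-split rules can be checked on a tiny lexicon; the words below are our own assumptions for illustration. Enumerating everything the SgS and PlS rules generate shows that mismatched forms like *this dogs eat are simply never produced:

```python
from itertools import product

# Assumed toy lexicon for the singular/plural-split categories.
LEX = {
    "SgDet": ["this"], "PlDet": ["those"],
    "SgNom": ["dog"],  "PlNom": ["dogs"],
    "SgV":   ["eats"], "PlV":   ["eat"],
}

def sentences():
    """All strings derivable via SgS -> SgDet SgNom SgV and PlS -> PlDet PlNom PlV."""
    out = set()
    for det, nom, v in product(LEX["SgDet"], LEX["SgNom"], LEX["SgV"]):
        out.add(f"{det} {nom} {v}")
    for det, nom, v in product(LEX["PlDet"], LEX["PlNom"], LEX["PlV"]):
        out.add(f"{det} {nom} {v}")
    return out

print("this dog eats" in sentences())   # True
print("this dogs eat" in sentences())   # False (number mismatch blocked)
```

The price, as the next slide notes, is that every category and rule gets duplicated once per feature value, and that duplication multiplies as more features are added.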

  35. CFG Solution for Agreement • It works and stays within the power of CFGs • But it’s ugly • And it doesn’t scale all that well

  36. Subcategorization • Sneeze: John sneezed • Find: Please find [a flight to NY]NP • Give: Give [me]NP [a cheaper fare]NP • Help: Can you help [me]NP [with a flight]PP • Prefer: I prefer [to leave earlier]TO-VP • Said: You said [United has a flight]S • …

  37. Subcategorization • *John sneezed the book • *I prefer United has a flight • *Give with a flight • Subcat expresses the constraints that a predicate (verb for now) places on the number and syntactic types of arguments it wants to take (occur with).

  38. So? • So the various rules for VPs overgenerate. • They permit the presence of strings containing verbs and arguments that don’t go together • For example • VP -> V NP • therefore Sneezed the book is a VP since “sneeze” is a verb and “the book” is a valid NP

  39. Subcategorization • Sneeze: John sneezed • Find: Please find [a flight to NY]NP • Give: Give [me]NP [a cheaper fare]NP • Help: Can you help [me]NP [with a flight]PP • Prefer: I prefer [to leave earlier]TO-VP • Told: I was told [United has a flight]S • …

  40. Forward Pointer • It turns out that verb subcategorization facts will provide a key element for semantic analysis (determining who did what to whom in an event).

  41. Possible CFG Solution • VP -> V • VP -> V NP • VP -> V NP PP • … • VP -> IntransV • VP -> TransV NP • VP -> TransVwPP NP PP • …
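One way to read these verb-class rules is as a lexicon pairing each verb with its argument frame; a VP is only licensed when the arguments match the frame. A minimal sketch (the verb list and `licensed` helper are our own assumptions):

```python
# Assumed subcat lexicon: each verb maps to its required argument frame.
SUBCAT = {
    "sneeze": [],             # IntransV:   VP -> IntransV
    "find":   ["NP"],         # TransV:     VP -> TransV NP
    "help":   ["NP", "PP"],   # TransVwPP:  VP -> TransVwPP NP PP
}

def licensed(verb, args):
    """A VP is licensed only if the argument types match the verb's frame exactly."""
    return SUBCAT.get(verb) == args

print(licensed("find", ["NP"]))      # True:  "find a flight"
print(licensed("sneeze", ["NP"]))    # False: "*sneezed the book"
```

This blocks the overgeneration from slide 38 while keeping the grammar context-free, at the cost of multiplying VP rules and verb categories, just as with agreement.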

  42. Movement • Core example • My travel agent booked the flight

  43. Movement • Core example • [[My travel agent]NP [booked [the flight]NP]VP]S • I.e. “book” is a straightforward transitive verb. It expects a single NP argument within the VP, and a single NP argument as the subject.

  44. Movement • What about? • Which flight do you want me to have the travel agent book? • The direct object argument to “book” isn’t appearing in the right place. It is in fact a long way from where it’s supposed to appear. • And note that it’s separated from its verb by 2 other verbs.

  45. CFGs: a summary • CFGs appear to be just about what we need to account for a lot of basic syntactic structure in English. • But there are problems • That can be dealt with adequately, although not elegantly, by staying within the CFG framework. • There are simpler, more elegant solutions that take us out of the CFG framework (beyond its formal power) • Syntactic theories: HPSG, LFG, CCG, Minimalism, etc.

  46. Other Syntactic stuff • Grammatical Relations • Subject • I booked a flight to New York • The flight was booked by my agent. • Object • I booked a flight to New York • Complement • I said that I wanted to leave

  47. Dependency Parsing • Word-to-word links instead of constituency • Based on the European rather than American traditions • But dates back to the Greeks • The original notions of Subject, Object and the progenitor of subcategorization (called ‘valence’) came out of Dependency theory. • Dependency parsing is quite popular as a computational model • Since relationships between words are quite useful

  48. Parsing • Parsing: assigning correct trees to input strings • Correct tree: a tree that covers all and only the elements of the input and has an S at the top • For now: enumerate all possible trees • A further task: disambiguation: means choosing the correct tree from among all the possible trees.
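Using the slide-12 toy grammar, "enumerate all possible trees" can be sketched as a backtracking top-down parser; the code and its nested-list tree representation are our own illustration:

```python
# Slide-12 toy grammar; trees are nested lists [label, child, child, ...].
GRAMMAR = {
    "S": [["NP", "VP"]], "NP": [["Det", "NOMINAL"]], "NOMINAL": [["Noun"]],
    "VP": [["Verb"]], "Det": [["a"]], "Noun": [["flight"]], "Verb": [["left"]],
}

def parse(sym, words, i):
    """Yield (tree, next_index) for every way sym can derive a prefix of words[i:]."""
    if sym not in GRAMMAR:                     # terminal: must match the next word
        if i < len(words) and words[i] == sym:
            yield sym, i + 1
        return
    for rhs in GRAMMAR[sym]:                   # try each production, backtracking
        def expand(children, j, rest):
            if not rest:
                yield [sym] + children, j
                return
            for tree, k in parse(rest[0], words, j):
                yield from expand(children + [tree], k, rest[1:])
        yield from expand([], i, rhs)

words = "a flight left".split()
trees = [t for t, j in parse("S", words, 0) if j == len(words)]
print(trees[0])
# ['S', ['NP', ['Det', 'a'], ['NOMINAL', ['Noun', 'flight']]], ['VP', ['Verb', 'left']]]
```

The filter `j == len(words)` enforces "covers all and only the elements of the input"; with an ambiguous grammar the list `trees` would hold one tree per analysis, which is where disambiguation comes in.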

  49. Parsing: examples using Rion Snow’s visualizer • Stanford parser • http://ai.stanford.edu/~rion/parsing/stanford_viz.html • Minipar parser • http://ai.stanford.edu/~rion/parsing/minipar_viz.html • Link grammar parser • http://ai.stanford.edu/~rion/parsing/linkparser_viz.html

  50. Treebanks • Parsed corpora in the form of trees • Examples:
