
CS460/626 : Natural Language Processing/Speech, NLP and the Web (Lecture 23, 24: Parsing Algorithms; Parsing in case of Ambiguity; Probabilistic Parsing). Pushpak Bhattacharyya, CSE Dept., IIT Bombay, 8th, 10th March, 2011


Presentation Transcript


  1. CS460/626 : Natural Language Processing/Speech, NLP and the Web (Lecture 23, 24: Parsing Algorithms; Parsing in case of Ambiguity; Probabilistic Parsing) Pushpak Bhattacharyya, CSE Dept., IIT Bombay, 8th, 10th March, 2011 (Lectures 21 and 22 were on Sentiment Analysis by Aditya Joshi)

  2. A note on Language Modeling • Example sentence: “^ The tortoise beat the hare in the race.” • The analysis is guided by frequency (the language model) and guided by world knowledge.

  3. Parse Tree
  (S (NP (DT The) (N tortoise))
     (VP (V beat)
         (NP (DT the) (N hare))
         (PP (P in) (NP (DT the) (N race)))))

  4. UNL Expression • beat@past with relations: agt(beat@past, tortoise@def), obj(beat@past, hare@def), scn(beat@past, race@def) (scn = scene)

  5. Purpose of LM • Prediction of next word (Speech Processing) • Language Identification (for same script) • Belongingness check (parsing) • P(NP → DT N) is the probability that the ‘YIELD’ of the non-terminal NP is DT N

  6. Need for Deep Parsing • Sentences are linear structures • But there is a hierarchy (a tree) hidden behind the linear structure • There are constituents and branches

  7. PPs are at the same level: flat with respect to the head word “book”; no distinction in terms of dominance or c-command.
  (NP The (AP big) (N book) (PP of poems) (PP with the blue cover))
  [The big book of poems with the blue cover] is on the table.

  8. “Constituency test of Replacement” runs into problems • One-replacement: • I bought the big [book of poems with the blue cover], not the small [one] • This one-replacement targets “book of poems with the blue cover” • Another one-replacement: • I bought the big [book of poems] with the blue cover, not the small [one] with the red cover • This one-replacement targets “book of poems”

  9. More deeply embedded structure
  (NP The
      (N'1 (AP big)
           (N'2 (N'3 (N book) (PP of poems))
                (PP with the blue cover))))

  10. Grammar and Parsing Algorithms

  11. A simplified grammar • S → NP VP • NP → DT N | N • VP → V ADV | V

  12. Example Sentence: People laugh
  1        2      3   (these are positions)
  Lexicon: People – N, V; Laugh – N, V
  The lexicon indicates that both noun and verb are possible for each of the words “people” and “laugh”.

  13. Top-Down Parsing
  Step   State             Backup State    Action
  1.     ((S) 1)           --              --
  2.     ((NP VP) 1)       --              --
  3a.    ((DT N VP) 1)     ((N VP) 1)      --
  3b.    ((N VP) 1)        --              --
  4.     ((VP) 2)          --              Consume “People”
  5a.    ((V ADV) 2)       ((V) 2)         --
  6.     ((ADV) 3)         ((V) 2)         Consume “laugh”
  5b.    ((V) 2)           --              --
  6.     (( ) 3)           --              Consume “laugh”
  The number in each state is the position of the input pointer. Termination condition: all inputs over, no symbols remaining. Note: input symbols can be pushed back.

  14. Discussion for Top-Down Parsing • This kind of searching is goal driven. • Gives importance to textual precedence (rule precedence). • Has no regard for the data a priori, so useless expansions are made.
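The top-down trace of slides 13–14 can be sketched as a small backtracking recogniser. This is my own illustrative code, not from the lecture; it uses the toy grammar and the ambiguous lexicon shown on the slides.

```python
# Sketch of backtracking top-down parsing (assumed implementation, not the
# lecture's) for the grammar S -> NP VP, NP -> DT N | N, VP -> V ADV | V,
# with the lexicon People/laugh = {N, V}.

GRAMMAR = {
    "S":  [["NP", "VP"]],
    "NP": [["DT", "N"], ["N"]],
    "VP": [["V", "ADV"], ["V"]],
}
LEXICON = {"people": {"N", "V"}, "laugh": {"N", "V"}}

def expand(symbols, pos, words):
    """Yield every input position reachable after matching `symbols`
    against words[pos:]; backtracking is implicit in the recursion."""
    if not symbols:
        yield pos
        return
    head, rest = symbols[0], symbols[1:]
    if head in GRAMMAR:                      # non-terminal: try each rule
        for rhs in GRAMMAR[head]:
            for p in expand(rhs + rest, pos, words):
                yield p
    elif pos < len(words) and head in LEXICON.get(words[pos], set()):
        for p in expand(rest, pos + 1, words):   # category matches: consume word
            yield p

def parses(sentence):
    """Count derivations of S that consume the whole sentence."""
    words = sentence.lower().split()
    return sum(1 for p in expand(["S"], 0, words) if p == len(words))

print(parses("People laugh"))   # prints 1: one successful derivation
```

The failed expansions of the trace (trying DT N before N, and V ADV before V) correspond to the dead branches the recursion silently abandons.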

  15. Bottom-Up Parsing • Some conventions: subscripts represent positions, e.g. N12 is an N spanning positions 1 to 2. • In an arc such as S1? -> NP12 ° VP2?, the work to the left of the dot (°) is done while the work to the right remains, and “?” marks an as-yet-unknown end position.

  16. Bottom-Up Parsing (pictorial representation)
  People laugh
  1      2     3
  Lexical arcs: N12, V12 (for “People”); N23, V23 (for “laugh”)
  NP12 -> N12 °    NP23 -> N23 °
  VP12 -> V12 °    VP23 -> V23 °
  S1? -> NP12 ° VP2?
  S -> NP12 VP23 °

  17. Problem with Top-Down Parsing • Left Recursion • Suppose the grammar has the rule A → A B. Then the expansion never terminates: ((A) k) → ((A B) k) → ((A B B) k) → ...
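A few lines of code (mine, purely illustrative) make the non-termination concrete: each application of A → A B rewrites the leftmost symbol without ever consuming input, so the parser's stack grows forever.

```python
# Tiny illustration of why naive top-down expansion loops on the
# left-recursive rule A -> A B: the leftmost A is rewritten to A B
# at every step and no input word is ever consumed.
stack = ["A"]
for step in range(4):
    stack = ["A", "B"] + stack[1:]      # apply A -> A B to the leftmost A
    print(step + 1, stack)
# after 4 steps the stack is ['A', 'B', 'B', 'B', 'B'] -- the A never goes away
```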

  18. Combining top-down and bottom-up strategies

  19. Top-Down Bottom-Up Chart Parsing • Combines advantages of top-down & bottom-up parsing. • Does not work in case of left recursion. • e.g. – “People laugh” • People – noun, verb • Laugh – noun, verb • Grammar – S → NP VP; NP → DT N | N; VP → V ADV | V

  20. Transitive Closure
  People laugh
  1      2     3
  Arcs starting at 1: S → ° NP VP, NP → ° N, NP → ° DT N
  Arcs ending at 2:   NP → N °, S → NP ° VP; predicted there: VP → ° V, VP → ° V ADV
  Arcs ending at 3:   VP → V °, S → NP VP °   (success)

  21. Arcs in Parsing • Each arc in the chart records • Completed work (left of °) • Expected work (right of °)

  22. Example
  People laugh loudly
  1      2     3      4
  Arcs starting at 1: S → ° NP VP, NP → ° N, NP → ° DT N
  Arcs ending at 2:   NP → N °, S → NP ° VP; predicted there: VP → ° V, VP → ° V ADV
  Arcs ending at 3:   VP → V °, VP → V ° ADV, S → NP VP °
  Arcs ending at 4:   VP → V ADV °, S → NP VP °   (success)
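One standard realisation of this combined top-down/bottom-up chart strategy is an Earley-style recogniser. The sketch below is my own reading of the slides, not the lecture's code: the chart states (lhs, rhs, dot, origin) are exactly the dotted arcs above, with predict, scan, and complete steps.

```python
# Earley-style chart recogniser (assumed implementation) for the slide
# grammar; a state (lhs, rhs, dot, origin) is the arc
# lhs -> rhs[:dot] ° rhs[dot:] spanning positions origin..i.

GRAMMAR = {
    "S":  [["NP", "VP"]],
    "NP": [["DT", "N"], ["N"]],
    "VP": [["V", "ADV"], ["V"]],
}
POS = {"people": {"N", "V"}, "laugh": {"N", "V"}, "loudly": {"ADV"}}

def earley(words):
    chart = [set() for _ in range(len(words) + 1)]
    chart[0].add(("GAMMA", ("S",), 0, 0))        # dummy start state
    for i in range(len(words) + 1):
        agenda = list(chart[i])
        while agenda:
            lhs, rhs, dot, origin = agenda.pop()
            if dot < len(rhs) and rhs[dot] in GRAMMAR:        # predict
                for alt in GRAMMAR[rhs[dot]]:
                    s = (rhs[dot], tuple(alt), 0, i)
                    if s not in chart[i]:
                        chart[i].add(s); agenda.append(s)
            elif dot < len(rhs):                              # scan a word
                if i < len(words) and rhs[dot] in POS[words[i]]:
                    chart[i + 1].add((lhs, rhs, dot + 1, origin))
            else:                                             # complete
                for l2, r2, d2, o2 in list(chart[origin]):
                    if d2 < len(r2) and r2[d2] == lhs:
                        s = (l2, r2, d2 + 1, o2)
                        if s not in chart[i]:
                            chart[i].add(s); agenda.append(s)
    return ("GAMMA", ("S",), 1, 0) in chart[len(words)]

print(earley("people laugh loudly".split()))   # prints True
```

Because completed arcs are recorded in the chart, no sub-parse is ever rebuilt; this is what makes the strategy efficient compared with plain backtracking.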

  23. Dealing With Structural Ambiguity • Multiple parses for a sentence • The man saw the boy with a telescope. • The man saw the mountain with a telescope. • The man saw the boy with the ponytail. At the level of syntax, all these sentences are ambiguous, but semantics can disambiguate the 2nd and 3rd sentences.

  24. Prepositional Phrase (PP) Attachment Problem • V – NP1 – P – NP2 (here P is a preposition) • Does the PP (P NP2) attach to NP1, or does it attach to V?

  25. Parse Trees for a Structurally Ambiguous Sentence Let the grammar be: S → NP VP; NP → DT N | DT N PP; PP → P NP; VP → V NP PP | V NP. For the sentence, “I saw a boy with a telescope”

  26. Parse Tree 1 (PP inside the object NP, i.e. noun attachment):
  (S (NP (N I))
     (VP (V saw)
         (NP (Det a) (N boy)
             (PP (P with) (NP (Det a) (N telescope))))))

  27. Parse Tree 2 (PP attached to the VP, i.e. verb attachment):
  (S (NP (N I))
     (VP (V saw)
         (NP (Det a) (N boy))
         (PP (P with) (NP (Det a) (N telescope)))))

  28. Parsing Structural Ambiguity

  29. Parsing for Structurally Ambiguous Sentences Sentence: “I saw a boy with a telescope” Grammar: S → NP VP; NP → ART N | ART N PP | PRON; VP → V NP PP | V NP; ART → a | an | the; N → boy | telescope; PRON → I; V → saw

  30. Ambiguous Parses • Two possible parses:
  • PP attached with Verb (i.e. I used a telescope to see):
  (S (NP (PRON “I”)) (VP (V “saw”) (NP (ART “a”) (N “boy”)) (PP (P “with”) (NP (ART “a”) (N “telescope”)))))
  • PP attached with Noun (i.e. the boy had a telescope):
  (S (NP (PRON “I”)) (VP (V “saw”) (NP (ART “a”) (N “boy”) (PP (P “with”) (NP (ART “a”) (N “telescope”))))))
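A small tree-building parser (my own sketch, not from the lecture) enumerates every parse of the sentence under the slide-29 grammar and confirms that exactly two analyses survive: PP under VP and PP inside the object NP.

```python
# Backtracking parser (assumed implementation) that builds trees and
# enumerates all parses of the PP-attachment sentence under the
# slide-29 grammar.

GRAMMAR = {
    "S":    [["NP", "VP"]],
    "NP":   [["ART", "N"], ["ART", "N", "PP"], ["PRON"]],
    "VP":   [["V", "NP", "PP"], ["V", "NP"]],
    "PP":   [["P", "NP"]],
    "ART":  [["a"], ["an"], ["the"]],
    "N":    [["boy"], ["telescope"]],
    "PRON": [["i"]],
    "V":    [["saw"]],
    "P":    [["with"]],
}

def parse(sym, words, pos):
    """Yield (tree, next_pos) for every way `sym` can span words[pos:...]."""
    for rhs in GRAMMAR.get(sym, []):
        if len(rhs) == 1 and rhs[0] not in GRAMMAR:        # lexical rule
            if pos < len(words) and words[pos] == rhs[0]:
                yield (sym, words[pos]), pos + 1
            continue
        def seq(symbols, p):
            # match the RHS symbols left to right, backtracking via recursion
            if not symbols:
                yield [], p
                return
            for child, p2 in parse(symbols[0], words, p):
                for kids, p3 in seq(symbols[1:], p2):
                    yield [child] + kids, p3
        for kids, p2 in seq(rhs, pos):
            yield (sym,) + tuple(kids), p2

words = "i saw a boy with a telescope".split()
trees = [t for t, p in parse("S", words, 0) if p == len(words)]
print(len(trees))   # prints 2: verb attachment and noun attachment
```

The two tuples in `trees` are exactly the two bracketings of slide 30, distinguished by whether the PP sits as a sister of V inside VP or inside the object NP.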

  31.–45. Top Down Parse (fifteen slides stepping through the top-down parse as figures; the figures are not recoverable from this transcript)

  46. Top Down Parsing - Observations • Top down parsing gave us the Verb Attachment Parse Tree (i.e., I used a telescope) • To obtain the alternate parse tree, the backup state in step 5 will have to be invoked • Is there an efficient way to obtain all parses?

  47. Bottom Up Parse
  I saw a boy with a telescope
  1 2   3 4   5    6 7         8
  Colour Scheme: Blue for Normal Parse, Green for Verb Attachment Parse, Purple for Noun Attachment Parse, Red for Invalid Parse

  48. Bottom Up Parse
  NP12 -> PRON12 °
  S1? -> NP12 ° VP2?
  I saw a boy with a telescope
  1 2   3 4   5    6 7         8

  49. Bottom Up Parse
  NP12 -> PRON12 °
  VP2? -> V23 ° NP3?
  S1? -> NP12 ° VP2?
  VP2? -> V23 ° NP3? PP??
  I saw a boy with a telescope
  1 2   3 4   5    6 7         8

  50. Bottom Up Parse
  NP12 -> PRON12 °
  VP2? -> V23 ° NP3?
  NP35 -> ART34 N45 °
  S1? -> NP12 ° VP2?
  VP2? -> V23 ° NP3? PP??
  NP3? -> ART34 N45 ° PP5?
  I saw a boy with a telescope
  1 2   3 4   5    6 7         8
