Probabilistic Earley Parsing

Presentation Transcript

  1. Probabilistic Earley Parsing
  Charlie Kehoe, Spring 2004
  Based on the 1995 paper by Andreas Stolcke: “An Efficient Probabilistic Context-Free Parsing Algorithm that Computes Prefix Probabilities”

  2. Overview
  • What is this paper all about?
  • Key ideas from the title:
    • Context-Free Parsing
    • Probabilistic
    • Computes Prefix Probabilities
    • Efficient

  3. Context-Free Parsing
  [Diagram: the sentence “The ball is heavy.” goes into a Parser, which produces a parse tree over “The ball is heavy”]

  4. Context-Free Parsing
  [Diagram: as above, with a Grammar feeding the Parser]

  5. Context-Free Parsing
  [Diagram: as above, with both a Grammar and a Lexicon feeding the Parser]

  6. Context-Free Parsing
  • What if there are multiple legal parses?
  • Example: “Yair looked over the paper.”
  • How does the word “over” function?
  [Diagram: two parse trees for the sentence, both with S → NP VP — one with preterminals N V NP (“over” as part of the verb), one with N V P NP (“over” as a preposition heading a PP)]

  7. Probabilistic Parsing
  • Use probabilities to find the most likely parse
  • Store typical probabilities for words and rules
  • In this case:
  [Diagram: the same two parse trees — the N V NP analysis with P = 0.99, the N V P NP (PP) analysis with P = 0.01]
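The idea on this slide can be sketched numerically. All probabilities below are invented for illustration (they are not from the paper): each candidate tree is scored as the product of the probabilities of the rules and words it uses, and the highest-scoring tree wins.

```python
from math import prod

# Hypothetical probabilities attached to the rules/words each tree uses
# for "Yair looked over the paper".
parses = {
    "VP -> V NP  ('looked over' as phrasal verb)": [1.0, 0.9, 0.5],
    "VP -> V PP  ('over' heads a prepositional phrase)": [1.0, 0.1, 0.5, 0.2],
}

def tree_probability(rule_probs):
    """P(tree) = product of the probabilities of the rules it uses."""
    return prod(rule_probs)

# Pick the most likely parse.
best = max(parses, key=lambda name: tree_probability(parses[name]))
```

With these made-up numbers the phrasal-verb analysis wins, mirroring the 0.99 vs. 0.01 split on the slide.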

  8. Prefix Probabilities
  • How likely is a partial parse?
  [Diagram: the same two parse trees built over the incomplete input “Yair looked over …”]

  9. Efficiency
  • The Earley algorithm (upon which Stolcke builds) is one of the most efficient known parsing algorithms
  • Probabilities allow intelligent pruning of the developing parse tree(s)

  10. Parsing Algorithms
  • How do we construct a parse tree?
    • Work from grammar to sentence (top-down)
    • Work from sentence to grammar (bottom-up)
    • Work from both ends at once (Earley)
  • Predictably, Earley works best

  11. Earley Parsing Overview
  • Start with a root constituent, e.g. sentence
  • Until the sentence is complete, repeatedly:
    • Predict: expand constituents as specified in the grammar
    • Scan: match constituents with words in the input
    • Complete: propagate constituent completions up to their parents
  • Prediction is top-down, while scanning and completion are bottom-up
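The predict/scan/complete loop can be sketched as a minimal (non-probabilistic) Earley recognizer. This is a sketch of the classic algorithm, not Stolcke's extension; the grammar and lexicon are the ones used in the example slides below.

```python
# Each chart entry is a state (lhs, rhs, dot, origin): a rule lhs -> rhs,
# progress marker `dot`, and the input position where the rule started.
GRAMMAR = {
    "S": [["NP", "VP"]],
    "NP": [["ART", "N"]],
    "VP": [["V", "ADJ"]],
}
LEXICON = {"the": "ART", "ball": "N", "is": "V", "heavy": "ADJ"}

def earley_recognize(words):
    n = len(words)
    chart = [set() for _ in range(n + 1)]
    chart[0].add(("GAMMA", ("S",), 0, 0))  # dummy start state
    for i in range(n + 1):
        changed = True
        while changed:  # run predict/complete to a fixed point at position i
            changed = False
            for (lhs, rhs, dot, origin) in list(chart[i]):
                if dot < len(rhs) and rhs[dot] in GRAMMAR:      # predict
                    for expansion in GRAMMAR[rhs[dot]]:
                        state = (rhs[dot], tuple(expansion), 0, i)
                        if state not in chart[i]:
                            chart[i].add(state)
                            changed = True
                elif dot == len(rhs):                            # complete
                    for (plhs, prhs, pdot, porig) in list(chart[origin]):
                        if pdot < len(prhs) and prhs[pdot] == lhs:
                            state = (plhs, prhs, pdot + 1, porig)
                            if state not in chart[i]:
                                chart[i].add(state)
                                changed = True
        if i < n:                                                # scan
            for (lhs, rhs, dot, origin) in list(chart[i]):
                if dot < len(rhs) and LEXICON.get(words[i]) == rhs[dot]:
                    chart[i + 1].add((lhs, rhs, dot + 1, origin))
    return ("GAMMA", ("S",), 1, 0) in chart[n]
```

For example, `earley_recognize("the ball is heavy".split())` accepts, while a scrambled or incomplete sentence is rejected.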

  12. Earley Parsing Overview
  • Earley parsing uses a chart rather than a tree to develop the parse
  • Constituents are stored independently, indexed by word positions in the sentence
  • Why do this?
    • Eliminate recalculation when tree branches are abandoned and later rebuilt
    • Concisely represent multiple parses

  13. Earley Parsing Example
  S → NP VP
  NP → ART N
  VP → V ADJ


  22. Probabilistic Parsing
  • How do we parse probabilistically?
  • Assign probabilities to grammar rules and words in the lexicon
  • The grammar and lexicon then “randomly” generate all possible sentences in the language
  • P(parse tree) = P(sentence generation)
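The generative view can be sketched as a tiny PCFG sampler (grammar, lexicon, and probabilities are invented for illustration): expanding symbols top-down according to rule and word probabilities yields a random sentence, and the probability of the derivation is the product of the choices made — exactly the quantity the parser recovers.

```python
import random

# Toy PCFG: each entry is (expansion-or-word, probability).
GRAMMAR = {
    "S": [(["NP", "VP"], 1.0)],
    "NP": [(["ART", "N"], 1.0)],
    "VP": [(["V", "ADJ"], 1.0)],
}
LEXICON = {
    "ART": [("the", 1.0)],
    "N": [("ball", 0.5), ("paper", 0.5)],
    "V": [("is", 1.0)],
    "ADJ": [("heavy", 0.7), ("light", 0.3)],
}

def pick(options, rng):
    """Choose one option according to its probability."""
    items, probs = zip(*options)
    i = rng.choices(range(len(items)), weights=probs)[0]
    return items[i], probs[i]

def generate(symbol, rng):
    """Expand `symbol` top-down; return (words, P(this derivation))."""
    if symbol in LEXICON:
        word, p = pick(LEXICON[symbol], rng)
        return [word], p
    rhs, p = pick(GRAMMAR[symbol], rng)
    words = []
    for sym in rhs:
        w, q = generate(sym, rng)
        words += w
        p *= q
    return words, p

sentence, prob = generate("S", random.Random(0))
```

With this toy grammar every sample has the shape ART N V ADJ, and its probability is 0.35 or 0.15 depending on the word choices.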

  23. Probabilistic Parsing
  • Terminology
    • Earley state: each step of the processing that a constituent undergoes. Examples:
      • Starting sentence
      • Half-finished sentence
      • Complete sentence
      • Half-finished noun phrase
      • etc.
    • Earley path: a sequence of linked states
      • Example: the complete parse just described

  24. Probabilistic Parsing
  • Can represent the parse as a Markov chain
  • The Markov assumption (state probability is independent of path) applies, due to the CFG
  [Diagram: a chain of Earley states — Begin S, Predict S ► NP VP, Begin NP ► ART N, Scan “the”, NP Half Done, Scan “ball”, NP Done, Complete NP, S Half Done, etc. — with an abandoned branch for the prediction NP ► PN]

  25. Probabilistic Parsing
  • Every Earley path corresponds to a parse tree
  • P(tree) = P(path)
  • Assign a probability to each state transition:
    • Prediction: probability of the grammar rule
    • Scanning: probability of the word in the lexicon
    • Completion: accumulated (“inner”) probability of the finished constituent
  • P(path) = product of P(transition)s
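As a worked example of the last bullet (transition probabilities invented; with this unambiguous toy grammar each completion contributes an inner probability that is already folded into the listed predictions and scans), the path probability is simply the product over transitions:

```python
from math import prod

# Hypothetical transitions along one Earley path for "the ball is heavy":
# predictions contribute rule probabilities, scans contribute lexicon
# probabilities.
transitions = [
    ("predict S -> NP VP", 1.0),
    ("predict NP -> ART N", 0.8),
    ("scan 'the' as ART", 1.0),
    ("scan 'ball' as N", 0.4),
    ("predict VP -> V ADJ", 0.6),
    ("scan 'is' as V", 1.0),
    ("scan 'heavy' as ADJ", 0.3),
]

# P(path) = product of P(transition)s
p_path = prod(p for _, p in transitions)
```

Here `p_path` comes out to 1.0 × 0.8 × 1.0 × 0.4 × 0.6 × 1.0 × 0.3 = 0.0576, which is also P(tree) for the corresponding parse.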

  26. Probabilistic Parse Example
  [Tables: a grammar and a lexicon, each rule and word annotated with a probability]


  38. Prefix Probabilities
  • The current algorithm reports the parse tree probability only when the sentence is completed
  • What if we don’t have a full sentence?
  • Probability is tracked by constituent (“inner”), rather than by path (“forward”)

  39. Prefix Probabilities
  • Solution: add a separate path probability
  • Same as before, but propagate down on the prediction step
  • This is the missing link to chain the path probabilities together
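A sketch of that bookkeeping (function names and all numbers are illustrative, not Stolcke's notation): on a prediction, the new state's forward (path) probability is the predicting state's forward probability times the rule probability, while its inner probability starts at just the rule probability; the prefix probability is then a sum of forward probabilities over the paths consistent with the input so far.

```python
def predict_update(parent_forward, rule_prob):
    """On predicting Y -> gamma from a state with forward probability
    parent_forward: multiply the rule probability into the forward
    probability; the inner probability starts as just the rule
    probability, since the new constituent has derived nothing yet."""
    new_forward = parent_forward * rule_prob
    new_inner = rule_prob
    return new_forward, new_inner

# Hypothetical forward probabilities of the two paths consistent with
# the prefix "Yair looked over ...":
forward = {
    "VP -> V NP path": 0.0099,
    "VP -> V PP path": 0.0001,
}
prefix_prob = sum(forward.values())
```

Summing over the surviving paths gives the prefix probability even though neither path has finished a sentence.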

  40. Prefix Probability Example


  49. Summary
  • Use an Earley chart parser for efficient parsing, even with ambiguous or complex sentences
  • Use probabilities to choose among multiple possible parse trees
  • Track constituent (“inner”) probability for complete sentences
  • Also track path (“forward”) probability for incomplete sentences
