
CS 388: Natural Language Processing: Semantic Parsing




  1. CS 388: Natural Language Processing: Semantic Parsing Raymond J. Mooney University of Texas at Austin

  2. Representing Meaning • Representing the meaning of natural language is ultimately a difficult philosophical question, i.e. the “meaning of meaning”. • Traditional approach is to map ambiguous NL to unambiguous logic in first-order predicate calculus (FOPC). • Standard inference (theorem proving) methods exist for FOPC that can determine when one statement entails (implies) another. Questions can be answered by determining what potential responses are entailed by given NL statements and background knowledge all encoded in FOPC.
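As a small illustration of this idea (the predicates and facts below are invented for the example, not taken from the slides): the background statement "every state has a capital" and the fact "Texas is a state", once encoded in FOPC, entail an answer to "Does Texas have a capital?".

```latex
% Illustrative FOPC encoding (predicate names are assumptions for this example)
\forall x \, \big( \mathit{State}(x) \rightarrow \exists y \, \mathit{Capital}(y, x) \big)
\qquad \mathit{State}(\mathit{texas})
% Standard theorem proving derives the entailed response to the question:
\models \; \exists y \, \mathit{Capital}(y, \mathit{texas})
```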

  3. Model Theoretic Semantics • Meaning of traditional logic is based on model theoretic semantics which defines meaning in terms of a model (a.k.a. possible world), a set-theoretic structure that defines a (potentially infinite) set of objects with properties and relations between them. • A model is a connecting bridge between language and the world by representing the abstract objects and relations that exist in a possible world. • An interpretation is a mapping from logic to the model that defines predicates extensionally, in terms of the set of tuples of objects that make them true (their denotation or extension). • The extension of Red(x) is the set of all red things in the world. • The extension of Father(x,y) is the set of all pairs of objects <A,B> such that A is B’s father.
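A minimal sketch of these definitions in code, assuming a toy possible world (the objects and extensions below are invented for illustration):

```python
# A toy model (possible world): a domain of objects plus predicate extensions.
# The objects and extensions are illustrative, not from the slides.
domain = {"a", "b", "c"}
extensions = {
    "Red": {("a",), ("c",)},          # extension of Red(x): all red things
    "Father": {("a", "b")},           # extension of Father(x,y): all <A,B> where A is B's father
}

def satisfies(predicate, *args):
    """An interpretation: an atomic formula is true iff its argument tuple
    lies in the predicate's extension (its denotation)."""
    return tuple(args) in extensions.get(predicate, set())

print(satisfies("Red", "a"))                       # True:  'a' is in Red's extension
print(satisfies("Father", "b", "a"))               # False: <b,a> is not in Father's extension
print(any(satisfies("Red", x) for x in domain))    # True: "something is red" holds in this model
```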

  4. Truth-Conditional Semantics • Model theoretic semantics gives the truth conditions for a sentence, i.e. a model satisfies a logical sentence iff the sentence evaluates to true in the given model. • The meaning of a sentence is therefore defined as the set of all possible worlds in which it is true.

  5. Semantic Parsing • Semantic Parsing: Transforming natural language (NL) sentences into completely formal logical forms or meaning representations (MRs). • Sample application domains where MRs are directly executable by another computer system to perform some task. • CLang: Robocup Coach Language • Geoquery: A Database Query Application

  6. CLang: RoboCup Coach Language • In the RoboCup Coach competition, teams compete to coach simulated players [http://www.robocup.org] • The coaching instructions are given in a formal language called CLang [Chen et al. 2003] • Example: "If the ball is in our goal area then player 1 should intercept it." → Semantic Parsing → CLang: (bpos (goal-area our) (do our {1} intercept))

  7. Geoquery: A Database Query Application • Query application for U.S. geography database containing about 800 facts [Zelle & Mooney, 1996] • Example: "Which rivers run through the states bordering Texas?" → Semantic Parsing → Query: answer(traverse(next_to(stateid(‘texas’)))) → Answer: Arkansas, Canadian, Cimarron, Gila, Mississippi, Rio Grande, …

  8. Procedural Semantics • The meaning of a sentence is a formal representation of a procedure that performs some action that is an appropriate response. • Answering questions • Following commands • In philosophy, the “late” Wittgenstein was known for the “meaning as use” view of semantics compared to the model theoretic view of the “early” Wittgenstein and other logicians.

  9. Predicate Logic Query Language Most existing work on computational semantics is based on predicate logic. What is the smallest state by area? → answer(x1,smallest(x2,(state(x1),area(x1,x2)))) — x1 is a logical variable that denotes “the smallest state by area”

  10. Functional Query Language (FunQL) Transform a logical language into a functional, variable-free language (Kate et al., 2005) What is the smallest state by area? answer(x1,smallest(x2,(state(x1),area(x1,x2)))) → answer(smallest_one(area_1(state(all))))
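A minimal sketch of why a variable-free FunQL expression is directly executable: each operator can be a function over a toy database (the table, numbers, and helper definitions here are assumptions for illustration, not the actual Geoquery implementation):

```python
# Toy state table: name -> area in square miles (illustrative figures).
STATES = {"rhode island": 1545, "delaware": 2489, "texas": 268596}

def state(_all):                     # state(all): the set of all states
    return set(STATES)

def area_1(states):                  # map each state to its area
    return {s: STATES[s] for s in states}

def smallest_one(values):            # pick the entity with the smallest value
    return min(values, key=values.get)

def answer(x):
    return x

# answer(smallest_one(area_1(state(all))))
print(answer(smallest_one(area_1(state("all")))))   # -> 'rhode island'
```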

  11. Learning Semantic Parsers • Manually programming robust semantic parsers is difficult due to the complexity of the task. • Semantic parsers can be learned automatically from sentences paired with their logical form. [Diagram: NL→MR training examples → Semantic-Parser Learner → Semantic Parser; at run time, Natural Language → Semantic Parser → Meaning Rep]

  12. Engineering Motivation • Most computational language-learning research strives for broad coverage while sacrificing depth. • “Scaling up by dumbing down” • Realistic semantic parsing currently entails domain dependence. • Domain-dependent natural-language interfaces have a large potential market. • Learning makes developing specific applications more tractable. • Training corpora can be easily developed by tagging existing corpora of formal statements with natural-language glosses.

  13. Cognitive Science Motivation • Most natural-language learning methods require supervised training data that is not available to a child. • General lack of negative feedback on grammar. • No POS-tagged or treebank data. • Assuming a child can infer the likely meaning of an utterance from context, NL→MR pairs are more cognitively plausible training data.

  14. Our Semantic-Parser Learners • CHILL+WOLFIE (Zelle & Mooney, 1996; Thompson & Mooney, 1999, 2003) • Separates parser-learning and semantic-lexicon learning. • Learns a deterministic parser using ILP techniques. • COCKTAIL (Tang & Mooney, 2001) • Improved ILP algorithm for CHILL. • SILT (Kate, Wong & Mooney, 2005) • Learns symbolic transformation rules for mapping directly from NL to LF. • SCISSOR (Ge & Mooney, 2005) • Integrates semantic interpretation into Collins’ statistical syntactic parser. • WASP (Wong & Mooney, 2006) • Uses syntax-based statistical machine translation methods. • KRISP (Kate & Mooney, 2006) • Uses a series of SVM classifiers employing a string kernel to iteratively build semantic representations.

  15. CHILL (Zelle & Mooney, 1992-96) • Semantic parser acquisition system using Inductive Logic Programming (ILP) to induce a parser written in Prolog. • Starts with a deterministic parsing “shell” written in Prolog and learns to control the operators of this parser to produce the given I/O pairs. • Requires a semantic lexicon, which for each word gives one or more possible meaning representations. • The parser must disambiguate words, introduce the proper semantic representation for each, and then put them together in the right way to produce a proper representation of the sentence.

  16. CHILL Example • U.S. Geographical database • Sample training pair • Cuál es el capital del estado con la población más grande? (What is the capital of the state with the largest population?) • answer(C, (capital(S,C), largest(P, (state(S), population(S,P))))) • Sample semantic lexicon • cuál : answer(_,_) • capital: capital(_,_) • estado: state(_) • más grande: largest(_,_) • población: population(_,_)

  17. WOLFIE(Thompson & Mooney, 1995-1999) • Learns a semantic lexicon for CHILL from the same corpus of semantically annotated sentences. • Determines hypotheses for word meanings by finding largest isomorphic common subgraphs shared by meanings of sentences in which the word appears. • Uses a greedy-covering style algorithm to learn a small lexicon sufficient to allow compositional construction of the correct representation from the words in a sentence.

  18. WOLFIE + CHILL Semantic Parser Acquisition [Diagram: NL→MR training examples → WOLFIE Lexicon Learner → Semantic Lexicon, which feeds the CHILL Parser Learner → Semantic Parser; at run time, Natural Language → Semantic Parser → Meaning Rep]

  19. Compositional Semantics Approach to semantic analysis that builds up an MR compositionally, following the syntactic structure of the sentence. Build the MR recursively bottom-up from the parse tree: BuildMR(parse-tree): If parse-tree is a terminal node (word), return an atomic lexical meaning for the word. Else, for each child subtree_i of parse-tree, create its MR by calling BuildMR(subtree_i); then return an MR for the overall parse-tree by properly combining the resulting MRs of its children.
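A minimal Python sketch of BuildMR under simplifying assumptions: the parse tree is either a word or a (label, children) pair, the lexicon entries are illustrative, and combine() is a hypothetical stand-in for a domain-specific composition function:

```python
# Minimal sketch of BuildMR; the lexicon and combine() are illustrative stand-ins.
LEXICON = {"Ohio": "stateid('ohio')", "capital": "capital()", "of": "loc_2()"}

def build_mr(tree, lexicon=LEXICON):
    """Build a meaning representation recursively, bottom-up, from a parse tree.
    A tree is either a word (string) or a (label, [children]) pair."""
    if isinstance(tree, str):                       # terminal node: atomic lexical meaning
        return lexicon.get(tree)                    # None marks a semantically vacuous word
    _label, children = tree
    child_mrs = [build_mr(child, lexicon) for child in children]
    return combine(child_mrs)

def combine(mrs):
    """Hypothetical composition: fill the open slot of a functional MR such as
    loc_2() with its sibling's MR, skipping vacuous (None) children; binary case only."""
    mrs = [m for m in mrs if m is not None]
    if not mrs:
        return None
    if len(mrs) == 1:
        return mrs[0]
    func, arg = mrs[0], mrs[1]
    return func.replace("()", f"({arg})") if func.endswith("()") else func

print(build_mr(("PP", ["of", ("NP", ["Ohio"])])))   # loc_2(stateid('ohio'))
```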

  20. Composing MRs from Parse Trees What is the capital of Ohio? [Parse tree with MRs composed bottom-up: Ohio → stateid('ohio'); of → loc_2(); PP “of Ohio” → loc_2(stateid('ohio')); capital → capital(); NP “the capital of Ohio” → capital(loc_2(stateid('ohio'))); What → answer(); S → answer(capital(loc_2(stateid('ohio'))))]

  21. Disambiguation with Compositional Semantics • The composition function that combines the MRs of the children of a node can return Ø if there is no sensible way to compose the children’s meanings. • Could compute all parse trees up-front and then compute semantics for each, eliminating any that ever generate a Ø semantics for any constituent. • More efficient method: • When filling the (CKY) chart of syntactic phrases, also compute all possible compositional semantics of each phrase as it is constructed and make an entry for each. • If a given phrase only gives Ø semantics, then remove this phrase from the table, thereby eliminating any parse that includes this meaningless phrase.
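A minimal sketch of the chart-pruning idea (the type table and composition function are hypothetical): a constituent whose only semantics is Ø is never entered in the chart, so no parse that contains it is ever considered:

```python
# Hypothetical type constraints on argument slots (illustrative only).
ARG_TYPES = {"loc_2": "state", "capital": "place"}

def compose(func_mr, func_name, arg_mr, arg_type):
    """Fill func_mr's argument slot with arg_mr if the types are compatible;
    otherwise return None, standing in for the Ø (meaningless) semantics."""
    if ARG_TYPES.get(func_name) != arg_type:
        return None
    return func_mr.replace("()", f"({arg_mr})")

def add_constituent(chart, span, label, mr):
    """Enter a phrase into the CKY chart only if it has a non-Ø semantics."""
    if mr is None:
        return                       # meaningless phrase: pruned from the chart
    chart.setdefault(span, []).append((label, mr))

chart = {}
add_constituent(chart, (3, 5), "PP", compose("loc_2()", "loc_2", "stateid('ohio')", "state"))
add_constituent(chart, (3, 5), "PP", compose("loc_2()", "loc_2", "riverid('ohio')", "river"))
print(chart)   # only the state reading of "of Ohio" survives
```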

  22. Composing MRs from Parse Trees What is the capital of Ohio? [Same sentence, but with Ohio incorrectly interpreted as riverid('ohio'); composing the constituents’ MRs yields Ø, so this interpretation is eliminated]

  23. Composing MRs from Parse Trees What is the capital of Ohio? [An alternative syntactic parse of the same sentence; here the composition of capital() and loc_2(stateid('ohio')) yields Ø at a higher constituent, so this parse is eliminated]

  24. SCISSOR: Semantic Composition that Integrates Syntax and Semantics to get Optimal Representations

  25. SCISSOR • An integrated syntax-based approach • Allows both syntax and semantics to be used simultaneously to build meaning representations • A statistical parser is used to generate a semantically augmented parse tree (SAPT) • Translate a SAPT into a complete formal meaning representation (MR) using a meaning composition process • MR: bowner(player(our,2)) [Example SAPT for “our player 2 has the ball”: S-bowner over NP-player (PRP$-team “our”, NN-player “player”, CD-unum “2”) and VP-bowner (VB-bowner “has”, NP-null: DT-null “the”, NN-null “ball”)]

  26. Semantic Composition Example [SAPT for “our player 2 has the ball” with composed semantic labels: PRP$-our, NN-player(_,_), and CD-2 combine into NP-player(our,2); VB-bowner(_) and the semantically vacuous NP-null (“the ball”) form VP-bowner(_); the root is S-bowner(player(our,2)); composition uses player(team,unum) and bowner(player), where labels with open slots require arguments and null labels require none]
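A minimal sketch of the slot-filling step shown in this example, assuming semantic labels with typed argument slots (the data structures are invented for illustration; they are not SCISSOR's internal representation):

```python
# Hypothetical SAPT slot filling: a node's semantic label has typed argument
# slots filled by its children's meanings, e.g. player(team, unum) -> player(our, 2).
def fill_slots(predicate, child_meanings, slot_types):
    """Fill the predicate's typed slots with the child meanings whose semantic
    type matches each slot (child_meanings is a list of (type, meaning) pairs)."""
    args = [next(m for t, m in child_meanings if t == slot_type)
            for slot_type in slot_types]
    return f"{predicate}({','.join(args)})"

np_mr = fill_slots("player", [("team", "our"), ("unum", "2")], ["team", "unum"])
s_mr  = fill_slots("bowner", [("player", np_mr)], ["player"])
print(s_mr)   # bowner(player(our,2))
```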

  29. SCISSOR • An integrated syntax-based approach • Allows both syntax and semantics to be used simultaneously to build meaning representations • A statistical parser is used to generate a semantically augmented parse tree (SAPT) • Translate a SAPT into a complete formal meaning representation (MR) using a meaning composition process • Allows statistical modeling of semantic selectional constraints in application domains, e.g. (AGENT pass) = PLAYER: the agent of a pass must be a PLAYER

  30. Overview of SCISSOR [Diagram: Training: SAPT training examples → learner → Integrated Semantic Parser; Testing: NL sentence → Integrated Semantic Parser → SAPT → ComposeMR → MR]

  31. Extending Collins’ (1997) Syntactic Parsing Model • Collins (1997) introduced a lexicalized head-driven syntactic parsing model • Bikel (2004) provides an easily-extended open-source version of the Collins statistical parser • We extend the parsing model to generate semantic labels simultaneously with syntactic labels, subject to the semantic constraints of the application domain

  32. Integrating Semantics into the Model • Use the same Markov processes • Add a semantic label to each node • Add semantic subcat frames • Give semantic subcategorization preferences: bowner takes a player as its argument [Side-by-side trees for “our player 2 has the ball”: the lexicalized syntactic tree (S(has), NP(player), VP(has), NP(ball), …) and the same tree with semantic labels added (S-bowner(has), NP-player(player), VP-bowner(has), NP-null(ball), …)]

  33. Adding Semantic Labels into the Model S-bowner(has) VP-bowner(has) Ph(VP-bowner | S-bowner, has)

  34. Adding Semantic Labels into the Model S-bowner(has) VP-bowner(has) {NP}-{player} { }-{ } Ph(VP-bowner | S-bowner, has) × Plc({NP}-{player} | S-bowner, VP-bowner, has) × Prc({}-{}| S-bowner, VP-bowner, has)

  35. Adding Semantic Labels into the Model S-bowner(has) NP-player(player) VP-bowner(has) {NP}-{player} { }-{ } Ph(VP-bowner | S-bowner, has) × Plc({NP}-{player} | S-bowner, VP-bowner, has) × Prc({}-{}| S-bowner, VP-bowner, has) × Pd(NP-player(player) | S-bowner, VP-bowner, has, LEFT, {NP}-{player})

  36. Adding Semantic Labels into the Model S-bowner(has) NP-player(player) VP-bowner(has) { }-{ } { }-{ } Ph(VP-bowner | S-bowner, has) × Plc({NP}-{player} | S-bowner, VP-bowner, has) × Prc({}-{}| S-bowner, VP-bowner, has) × Pd(NP-player(player) | S-bowner, VP-bowner, has, LEFT, {NP}-{player})

  37. Adding Semantic Labels into the Model S-bowner(has) NP-player(player) VP-bowner(has) STOP { }-{ } { }-{ } Ph(VP-bowner | S-bowner, has) × Plc({NP}-{player} | S-bowner, VP-bowner, has) × Prc({}-{}| S-bowner, VP-bowner, has) × Pd(NP-player(player) | S-bowner, VP-bowner, has, LEFT, {NP}-{player}) × Pd(STOP | S-bowner, VP-bowner, has, LEFT, {}-{})

  38. Adding Semantic Labels into the Model S-bowner(has) NP-player(player) VP-bowner(has) STOP STOP { }-{ } { }-{ } Ph(VP-bowner | S-bowner, has) × Plc({NP}-{player} | S-bowner, VP-bowner, has) × Prc({}-{}| S-bowner, VP-bowner, has) × Pd(NP-player(player) | S-bowner, VP-bowner, has, LEFT, {NP}-{player}) × Pd(STOP | S-bowner, VP-bowner, has, LEFT, {}-{}) × Pd(STOP | S-bowner, VP-bowner, has, RIGHT, {}-{})
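Collecting the factors built up over the last few slides, the probability of this single node expansion is the product below (a restatement of the slides' terms in one formula, not an addition to the model), where P = S-bowner, H = VP-bowner, w = has, LC = {NP}-{player}, RC = {}-{}, and the dependents D_i are NP-player(player) and the two STOP symbols:

```latex
P_h(H \mid P, w)\,
P_{lc}(LC \mid P, H, w)\,
P_{rc}(RC \mid P, H, w)\,
\prod_i P_d(D_i \mid P, H, w, \mathrm{side}_i, \mathrm{subcat}_i)
```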

  39. SCISSOR Parser Implementation • Supervised training on annotated SAPTs is just frequency counting • An augmented smoothing technique is employed to account for the additional data sparsity created by semantic labels. • Parsing of test sentences to find the most probable SAPT is performed using a variant of the standard CKY chart-parsing algorithm.

  40. Smoothing • Each label in a SAPT is the combination of a syntactic label and a semantic label, which increases data sparsity • Use Bayes rule to break the parameters down: Ph(H | P, w) = Ph(Hsyn, Hsem | P, w) = Ph(Hsyn | P, w) × Ph(Hsem | P, w, Hsyn)
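A minimal sketch of estimating the decomposed head parameter from SAPT counts (the count table and numbers are invented for illustration; the real model also backs off to less specific contexts):

```python
from collections import Counter

# Illustrative counts from annotated SAPTs (invented numbers).
# Keys: (parent label, head word, head syntactic label, head semantic label)
counts = Counter({("S-bowner", "has", "VP", "bowner"): 40,
                  ("S-bowner", "has", "VP", "null"): 10})

def p_head(parent, word, h_syn, h_sem):
    """P_h(H | P, w) = P_h(Hsyn | P, w) * P_h(Hsem | P, w, Hsyn)."""
    total   = sum(c for (p, w, _, _), c in counts.items() if (p, w) == (parent, word))
    syn_tot = sum(c for (p, w, s, _), c in counts.items() if (p, w, s) == (parent, word, h_syn))
    joint   = counts[(parent, word, h_syn, h_sem)]
    p_syn = syn_tot / total          # P_h(Hsyn | P, w)
    p_sem = joint / syn_tot          # P_h(Hsem | P, w, Hsyn)
    return p_syn * p_sem

print(p_head("S-bowner", "has", "VP", "bowner"))   # 1.0 * 0.8 = 0.8
```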

  41. Learning Semantic Parsers with a Formal Grammar for Meaning Representations • Our other techniques assume that meaning representation languages (MRLs) have deterministic context-free grammars • True for almost all computer languages • MRs can be parsed unambiguously

  42. NL: Which rivers run through the states bordering Texas? MR: answer(traverse(next_to(stateid(‘texas’)))) Parse tree of the MR has: Non-terminals: ANSWER, RIVER, TRAVERSE, STATE, NEXT_TO, STATEID Terminals: answer, traverse, next_to, stateid, ‘texas’ Productions: ANSWER → answer(RIVER), RIVER → TRAVERSE(STATE), STATE → NEXT_TO(STATE), STATE → STATEID, TRAVERSE → traverse, NEXT_TO → next_to, STATEID → ‘texas’
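A minimal sketch of parsing such an MR unambiguously, assuming only the single-argument productions shown on this slide (the mapping below and the slight collapsing of the stateid wrapper are simplifications for illustration):

```python
import re
from pprint import pprint

# Which MRL production each function symbol instantiates (from this slide),
# written here with "->"; multi-argument predicates are not handled in this sketch.
SYMBOL_TO_PRODUCTION = {
    "answer":   "ANSWER -> answer(RIVER)",
    "traverse": "RIVER -> TRAVERSE(STATE)",
    "next_to":  "STATE -> NEXT_TO(STATE)",
    "stateid":  "STATE -> STATEID",
}

def parse_mr(mr):
    """Deterministically parse a single-argument functional MR into a tree of
    (production, [children]); quoted constants become leaves."""
    mr = mr.strip()
    match = re.fullmatch(r"(\w+)\((.*)\)", mr)
    if not match:                                   # a constant such as 'texas'
        return (f"STATEID -> {mr}", [])
    symbol, inner = match.groups()
    return (SYMBOL_TO_PRODUCTION[symbol], [parse_mr(inner)])

pprint(parse_mr("answer(traverse(next_to(stateid('texas'))))"))
```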

  43. KRISP: Kernel-based Robust Interpretation for Semantic Parsing • Learns a semantic parser from NL sentences paired with their respective MRs, given the MRL grammar • Productions of the MRL are treated like semantic concepts • An SVM classifier with a string subsequence kernel is trained for each production to identify whether an NL substring represents that semantic concept • These classifiers are used to compositionally build MRs of the sentences
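A minimal sketch of the kind of string subsequence kernel involved, following the standard gap-weighted recursion over word sequences (the decay value and example strings are illustrative; this is not KRISP's actual implementation):

```python
from functools import lru_cache

def subsequence_kernel(s, t, n, lam=0.5):
    """Gap-weighted subsequence kernel of order n over token sequences,
    rendered plainly (unoptimized) from the standard recursion."""
    s, t = tuple(s), tuple(t)

    @lru_cache(maxsize=None)
    def k_prime(i, s, t):
        if i == 0:
            return 1.0
        if len(s) < i or len(t) < i:
            return 0.0
        head, a = s[:-1], s[-1]
        total = lam * k_prime(i, head, t)
        for j, b in enumerate(t):
            if b == a:
                total += k_prime(i - 1, head, t[:j]) * lam ** (len(t) - j + 1)
        return total

    @lru_cache(maxsize=None)
    def k(i, s, t):
        if min(len(s), len(t)) < i:
            return 0.0
        head, a = s[:-1], s[-1]
        total = k(i, head, t)
        for j, b in enumerate(t):
            if b == a:
                total += k_prime(i - 1, head, t[:j]) * lam ** 2
        return total

    return k(n, s, t)

# The shared subsequence ("states", "texas") contributes, with gaps penalized.
print(subsequence_kernel("states bordering texas".split(),
                         "which states border texas".split(), 2))   # 0.015625
```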

  44. Overview of KRISP [Diagram: Training: NL sentences paired with MRs and the MRL grammar → collect positive and negative examples → train string-kernel-based SVM classifiers → semantic parser; the parser’s best MRs (correct and incorrect) are fed back to collect further examples. Testing: novel NL sentences → semantic parser → best MRs]


  46. KRISP’s Semantic Parsing • We first define Semantic Derivation of an NL sentence • We next define Probability of a Semantic Derivation • Semantic parsing of an NL sentence involves finding its Most Probable Semantic Derivation • Straightforward to obtain MR from a semantic derivation

  47. Semantic Derivation of an NL Sentence MR parse with non-terminals on the nodes: [MR parse tree with non-terminals ANSWER, RIVER, TRAVERSE, STATE, NEXT_TO, STATEID over the terminals answer, traverse, next_to, stateid, ‘texas’] Which rivers run through the states bordering Texas?

  48. Semantic Derivation of an NL Sentence MR parse with productions on the nodes: ANSWER → answer(RIVER) RIVER → TRAVERSE(STATE) TRAVERSE → traverse STATE → NEXT_TO(STATE) NEXT_TO → next_to STATE → STATEID STATEID → ‘texas’ Which rivers run through the states bordering Texas?

  49. Semantic Derivation of an NL Sentence Semantic Derivation: Each node covers an NL substring: ANSWER → answer(RIVER) RIVER → TRAVERSE(STATE) TRAVERSE → traverse STATE → NEXT_TO(STATE) NEXT_TO → next_to STATE → STATEID STATEID → ‘texas’ Which rivers run through the states bordering Texas?

  50. Semantic Derivation of an NL Sentence Semantic Derivation: Each node contains a production and the substring of the NL sentence it covers: (ANSWER → answer(RIVER), [1..9]) (RIVER → TRAVERSE(STATE), [1..9]) (TRAVERSE → traverse, [1..4]) (STATE → NEXT_TO(STATE), [5..9]) (NEXT_TO → next_to, [5..7]) (STATE → STATEID, [8..9]) (STATEID → ‘texas’, [8..9]) Which rivers run through the states bordering Texas? (word positions: 1 2 3 4 5 6 7 8 9)
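A minimal sketch of the earlier point that the MR is straightforward to read off a semantic derivation: each node carries a production (flattened here to a template) and the NL span it covers, and substituting each child's MR into its parent's template bottom-up yields the final MR (this encoding is invented for illustration):

```python
# Hypothetical encoding of a semantic derivation: each node is
# (production right-hand side as a template, covered NL span, children).
derivation = ("answer(RIVER)", (1, 9), [
    ("traverse(STATE)", (1, 9), [
        ("next_to(STATE)", (5, 9), [
            ("stateid('texas')", (8, 9), []),
        ]),
    ]),
])

def mr_from_derivation(node):
    """Read the MR off a semantic derivation: substitute each child's MR for
    the corresponding non-terminal in its parent's template, bottom-up."""
    template, _span, children = node
    for child in children:
        child_mr = mr_from_derivation(child)
        non_terminal = next(nt for nt in ("RIVER", "STATE") if nt in template)
        template = template.replace(non_terminal, child_mr, 1)
    return template

print(mr_from_derivation(derivation))   # answer(traverse(next_to(stateid('texas'))))
```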
