
Representing Meaning


Presentation Transcript


  1. Representing Meaning Lecture 19 13 Sep 2007

  2. Semantic Analysis • Semantic analysis is the process of taking in some linguistic input and assigning a meaning representation to it. • There are a lot of different ways to do this that make more or less (or no) use of syntax. • We're going to start with the idea that syntax does matter: the compositional, rule-to-rule approach. • Compositional semantics: syntax-driven methods of assigning semantics to sentences.

  3. Semantic Processing • We're going to discuss two ways to attack this problem (just as we did with parsing). • There's the theoretically motivated, correct-and-complete approach: • Computational/compositional semantics: create a FOL representation that accounts for all the entities, roles and relations present in a sentence. • And there are practical approaches that have some hope of being useful and successful: • Information extraction: do a superficial analysis that pulls out only the entities, relations and roles that are of interest to the consuming application.

  4. Compositional Analysis • Principle of Compositionality • The meaning of a whole is derived from the meanings of the parts • What parts? • The constituents of the syntactic parse of the input • What could it mean for a part to have a meaning?

  5. Better • It turns out this representation isn't quite as useful as it could be: Giving(John, Mary, Book) • Better would be one where the "roles" or "cases" are separated out, e.g.: ∃x,y Giving(x) ∧ Giver(John, x) ∧ Given(y, x) ∧ Givee(Mary, x) ∧ Isa(y, Book) • Note: essentially Giver = Agent, Given = Theme, Givee = To-Poss

  6. Predicates • The notion of a predicate just got more complicated… • In this example, think of the verb/VP providing a template like the following • The semantics of the NPs and the PPs in the sentence plug into the slots provided in the template
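The template itself was shown as a figure on the original slide; a plausible reconstruction, in the λ-notation the deck introduces on slides 13 and 25, is:

λx λy ∃e ISA(e,Serving) ∧ Server(e,y) ∧ Served(e,x)

The semantics of the NPs then plug into x and y as the λs are applied.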

  7. Advantages • Can have a variable number of arguments associated with an event: events have many roles, and fillers can be glued on as they appear in the input. • Specifies categories (e.g., book) so that we can make assertions about categories themselves as well as their instances, e.g., Isa(MobyDick, Novel), AKO(Novel, Book). • Reifies events so that they can be quantified and related to other events and objects via sets of defined relations. • Can see logical connections between closely related examples without the need for meaning postulates.

  8. Example • AyCaramba serves meat

  9. Compositional Analysis
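The worked analysis appeared as a figure on the original slide; in the event-based style just motivated (and matching the grammar on slides 11, 12 and 25), the target representation for AyCaramba serves meat is:

∃e ISA(e,Serving) ∧ Server(e,AyCaramba) ∧ Served(e,Meat)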

  10. Augmented Rules • We'll accomplish this by attaching semantic formation rules to our syntactic CFG rules. • Abstractly: A → α1 … αn { f(α1.sem, …, αn.sem) } • This should be read as: the semantics we attach to A can be computed from some function applied to the semantics of A's parts.

  11. Example: the easy parts
  NP → PropNoun { PropNoun.sem }
  NP → MassNoun { MassNoun.sem }
  PropNoun → AyCaramba { AyCaramba }
  MassNoun → meat { MEAT }

  12. Example: the harder parts
  S → NP VP { VP.sem(NP.sem) }
  VP → Verb NP { Verb.sem(NP.sem) }
  Verb → serves { ??? }

  13. Lambda Forms • A simple addition to FOPC • Take a FOPC sentence with variables in it that are to be bound. • Allow those variables to be bound by treating the lambda form as a function with formal arguments

  14.–17. Examples • The worked λ-application examples appeared as figures on the original slides; a sketch of what they showed follows below.
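A minimal runnable sketch of the λ-application steps for AyCaramba serves meat, using Python lambdas as stand-ins for λ-forms and strings for the resulting formulas (the encoding is mine; the verb's λ-form is the one given on slide 25):

```python
# λ-application (beta reduction) for "AyCaramba serves meat".
# Verb "serves": λx.λy.∃e ISA(e,Serving) ∧ Server(e,y) ∧ Served(e,x)
serves = lambda x: lambda y: f"∃e ISA(e,Serving) ∧ Server(e,{y}) ∧ Served(e,{x})"

vp_sem = serves("Meat")        # VP → Verb NP : Verb.sem(NP.sem)
s_sem = vp_sem("AyCaramba")    # S → NP VP   : VP.sem(NP.sem)
print(s_sem)
# ∃e ISA(e,Serving) ∧ Server(e,AyCaramba) ∧ Served(e,Meat)
```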

  18. Syntax/Semantics Interface: Two Philosophies • Let the syntax do what syntax does well and don’t expect it to know much about meaning • In this approach, the lexical entry’s semantic attachments do all the work • Assume the syntax does know something about meaning • Here the grammar gets complicated and the lexicon simpler (constructional approach)

  19. Example • Mary freebled John the nim. • Who has it? • Where did he get it from? • Why?

  20. Example • Consider the attachments for the VPs:
  VP → Verb NP NP (gave Mary a book)
  VP → Verb NP PP (gave a book to Mary)
  • Assume the meaning representations should be the same for both. • Under the lexicon-heavy scheme, the VP attachments are: { Verb.sem(NP1.sem, NP2.sem) } and { Verb.sem(NP1.sem, PP.sem) }
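A quick sketch of the lexicon-heavy scheme (the Python encoding is mine; the role names follow slide 5): the verb's λ-form carries all the role structure, and each VP rule just routes its constituents' semantics into the right argument slots, so both rules yield the same representation.

```python
def gave(recipient, thing):
    # The verb contributes the whole event template (roles as on slide 5).
    return lambda giver: (f"∃e Giving(e) ∧ Giver({giver},e) "
                          f"∧ Given({thing},e) ∧ Givee({recipient},e)")

# VP → Verb NP NP : Verb.sem(NP1.sem, NP2.sem)   ("gave Mary a book")
vp_ditransitive = gave("Mary", "Book")
# VP → Verb NP PP : the recipient now arrives via the PP ("gave a book to Mary")
vp_dative = gave("Mary", "Book")

print(vp_ditransitive("John") == vp_dative("John"))  # True: same representation
```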

  21. Example • Under a syntax-heavy scheme we might want to do something like: • VP → V NP NP { V.sem ∧ Recip(NP1.sem) ∧ Object(NP2.sem) } • VP → V NP PP { V.sem ∧ Recip(PP.sem) ∧ Object(NP1.sem) } • i.e., the verb only contributes the predicate; the grammar "knows" the roles.

  22. Integration • Two basic approaches: • Integrate semantic analysis into the parser (assign meaning representations as constituents are completed) • Pipeline: assign meaning representations to trees only once parsing has completed them

  23. Semantic Augmentation to CFG Rules • CFG rules are augmented with semantic attachments. • These semantic attachments specify how to compute the meaning representation of a construction from the meanings of its constituent parts. • A CFG rule with a semantic attachment has the form: A → α1 … αn { f(αj.sem, …, αk.sem) } • The meaning representation of A, A.sem, is calculated by applying the function f to the semantic representations of some of A's constituents.

  24. Naïve Approach
  ProperNoun → Anarkali { Anarkali }
  MassNoun → meat { Meat }
  NP → ProperNoun { ProperNoun.sem }
  NP → MassNoun { MassNoun.sem }
  Verb → serves { ∃e,x,y ISA(e,Serving) ∧ Server(e,x) ∧ Served(e,y) }
  • But we cannot propagate this representation to upper levels.

  25. Using Lambda Notations
  ProperNoun → Anarkali { Anarkali }
  MassNoun → meat { Meat }
  NP → ProperNoun { ProperNoun.sem }
  NP → MassNoun { MassNoun.sem }
  Verb → serves { λx λy ∃e ISA(e,Serving) ∧ Server(e,y) ∧ Served(e,x) }   ← lambda expression
  VP → Verb NP { Verb.sem(NP.sem) }   ← application of a lambda expression
  S → NP VP { VP.sem(NP.sem) }

  26. Quasi-Logical Form • During semantic analysis we may use quantified expressions as terms; in that case the formula is not a FOPC formula. We call formulas of this form quasi-logical forms. • A quasi-logical form is converted into a normal FOPC formula by applying simple syntactic translations:
  Server(e, <∃x ISA(x,Restaurant)>) (a quasi-logical formula)
  ∃x ISA(x,Restaurant) ∧ Server(e,x) (a normal FOPC formula)

  27. Parse Tree with Logical Forms
  S : write(bertrand, principia)
    NP : bertrand (bertrand)
    VP : λy.write(y, principia)
      V : λx.λy.write(y, x) (writes)
      NP : principia (principia)

  28. Pros and Cons • If you integrate semantic analysis into the parser as it is running… • You can use semantic constraints to cut off parses that make no sense • But you assign meaning representations to constituents that don’t take part in the correct (most probable) parse

  29. Complex Terms • Allow the compositional system to pass around representations like the following as objects with parts: Complex-Term → <Quantifier var body>

  30. Example • Our restaurant example (A restaurant serves meat) winds up looking like: ∃e ISA(e,Serving) ∧ Server(e, <∃x ISA(x,Restaurant)>) ∧ Served(e, Meat) • Big improvement…

  31. Conversion • So complex terms wind up being embedded inside predicates; pull them out and redistribute the parts in the right way: P(<Quantifier var body>) turns into: Quantifier var (body Connective P(var))

  32. Example • Server(e, <∃x ISA(x,Restaurant)>) converts to: ∃x (ISA(x,Restaurant) ∧ Server(e,x))

  33. Quantifiers and Connectives • If the quantifier is an existential, then the connective is ∧ (and). • If the quantifier is a universal, then the connective is → (implies).
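A minimal sketch of this conversion in Python (the triple encoding of complex terms is my assumption; the connective choice follows the slide):

```python
# Convert P(<Quantifier var body>) into: Quantifier var (body Connective P(var)).
def convert(pred, term):
    quantifier, var, body = term          # e.g. ("∃", "x", "ISA(x,Restaurant)")
    connective = "∧" if quantifier == "∃" else "→"
    return f"{quantifier}{var} ({body} {connective} {pred(var)})"

a_restaurant = ("∃", "x", "ISA(x,Restaurant)")
print(convert(lambda v: f"Server(e,{v})", a_restaurant))
# ∃x (ISA(x,Restaurant) ∧ Server(e,x))

every_restaurant = ("∀", "x", "ISA(x,Restaurant)")
print(convert(lambda v: f"Server(e,{v})", every_restaurant))
# ∀x (ISA(x,Restaurant) → Server(e,x))
```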

  34. Multiple Complex Terms • Note that the conversion technique pulls the quantifiers out to the front of the logical form… • That leads to ambiguity if there’s more than one complex term in a sentence.

  35. Quantifier Ambiguity • Consider: Every restaurant has a menu • That could mean that every restaurant has a menu of its own • Or that there's some uber-menu out there and all restaurants have that same menu
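Written out in FOPC, the two readings differ only in which quantifier takes wide scope (the Has predicate is my shorthand):

∀x ISA(x,Restaurant) → ∃y (ISA(y,Menu) ∧ Has(x,y))   (each restaurant has a menu of its own)
∃y ISA(y,Menu) ∧ ∀x (ISA(x,Restaurant) → Has(x,y))   (one menu shared by all restaurants)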

  36. Quantifier Scope Ambiguity

  37. Ambiguity • This turns out to be a lot like the prepositional phrase attachment problem • The number of possible interpretations goes up exponentially with the number of complex terms in the sentence • The best we can do is to come up with weak methods to prefer one interpretation over another
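One way to see the blow-up (a toy sketch, encoding mine): after conversion, every ordering of the quantifier prefix is a candidate reading, so with n complex terms there are up to n! candidate scopings.

```python
from itertools import permutations

# Each ordering of the quantifier prefix is one candidate scoping.
quantifiers = ["∀x", "∃y", "∃z"]
for prefix in permutations(quantifiers):
    print(" ".join(prefix), "…")
# 3 complex terms already yield 3! = 6 candidate prefixes.
```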

  38. Non-Compositionality • Unfortunately, there are lots of examples where the meaning (loosely defined) can’t be derived from the meanings of the parts • Idioms, jokes, irony, sarcasm, metaphor, metonymy, indirect requests, etc

  39. English Idioms • Kick the bucket, buy the farm, bite the bullet, run the show, bury the hatchet, etc… • Lots of these… constructions where the meaning of the whole is either • Totally unrelated to the meanings of the parts (kick the bucket) • Related in some opaque way (run the show)

  40. The Tip of the Iceberg • How should we describe this construction? • A fixed phrase with a particular meaning? • A syntactically and lexically flexible phrase with a particular meaning? • A syntactically and lexically flexible phrase with a partially compositional meaning? • …

  41. Example • Enron is the tip of the iceberg. NP → "the tip of the iceberg" • Not so good… attested examples: • the tip of Mrs. Ford's iceberg • the tip of a 1000-page iceberg • the merest tip of the iceberg • How about: That's just the iceberg's tip.

  42. Example • What we seem to need is something like: NP → an initial NP with tip as its head, followed by a PP headed by of whose NP has iceberg as its head • …and that allows modifiers like merest, Mrs. Ford's, and 1000-page to modify the relevant semantic forms.

  43. Quantified Phrases • Consider: A restaurant serves meat. • Assume that a restaurant looks like: <∃x ISA(x,Restaurant)> • If we do the normal lambda thing we get: ∃e ISA(e,Serving) ∧ Server(e, <∃x ISA(x,Restaurant)>) ∧ Served(e, Meat)

  44. Semantic Analysis • Goal: build formal meaning structures from smaller pieces • Three approaches: • Syntax-driven semantic analysis • Semantic grammar • Information extraction: filling templates

  45. Semantic Grammar • Syntactic parse trees often contain constituents that are unimportant for semantic processing. • Ex: Mary wants to go to eat some Italian food • Rules in a semantic grammar (sketched below): • InfoRequest → USER want to go to eat FOODTYPE • FOODTYPE → NATIONALITY FOODTYPE • NATIONALITY → Italian | Mexican | …
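A toy rendering of such a semantic grammar as patterns (the regex encoding is my sketch; the category names come from the slide, with the recursive FOODTYPE rule approximated by an optional nationality):

```python
import re

NATIONALITY  = r"(?P<nationality>Italian|Mexican)"
FOODTYPE     = rf"(?:{NATIONALITY}\s+)?(?P<foodtype>food)"
INFO_REQUEST = rf"(?P<user>\w+) wants? to go to eat (?:some\s+)?{FOODTYPE}"

m = re.search(INFO_REQUEST, "Mary wants to go to eat some Italian food")
if m:
    # The match directly yields the semantically relevant slots.
    print(m.group("user"), m.group("nationality"), m.group("foodtype"))
    # Mary Italian food
```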

  46. Semantic grammar (cont) Pros: • No need for syntactic parsing • Focus on relevant info • Semantic grammar helps to disambiguate Cons: • The grammar is domain-specific.

  47. Information Extraction • The desired knowledge can be described by a relatively simple and fixed template. • Only a small part of the info in the text is relevant for filling the template. • No full parsing is needed: chunking, NE tagging, pattern matching, … • IE is a big field: e.g., MUC, KnowItAll.
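A minimal sketch of template filling by pattern matching (the template fields and the pattern are illustrative assumptions, not from the slides):

```python
import re

# One fixed template: who serves what.
SERVES = re.compile(r"(?P<restaurant>[A-Z][\w']*) serves (?P<food>\w+)")

def fill_template(text):
    m = SERVES.search(text)
    return {"restaurant": m["restaurant"], "food": m["food"]} if m else None

print(fill_template("AyCaramba serves meat"))
# {'restaurant': 'AyCaramba', 'food': 'meat'}
```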

  48. Summary of Semantic Analysis • Goal: build formal meaning structures from smaller pieces • Three approaches: • Syntax-driven semantic analysis • Semantic grammar • Information extraction

  49. Lexical Semantics

  50. Meaning • Traditionally, meaning in language has been studied from three perspectives: • The meaning of a text or discourse • The meanings of individual sentences or utterances • The meanings of individual words • We started in the middle; now we'll look at the meanings of individual words.
