
Artificial Intelligence: Natural Language


Presentation Transcript


  1. Artificial Intelligence: Natural Language A little more on grammars Semantics Pragmatics Generation

  2. More on grammars • Consider the following examples: • “John likes.” NOT OK • “John jumps.” OK • “John jumps in the water.” OK • “The small fluffy cat jumps.” OK • “John like the cat.” NOT OK • “The cats likes John.” NOT OK • “The cat on the table likes John.” OK

  3. Better grammar • Should deal with: • Intransitive/transitive verbs. The former are ones that don’t need a following noun phrase. • Prepositional phrases (e.g., “in the lake”). Preposition followed by a noun phrase. • Series of adjectives. A recursive rule can be used. • Subject-verb agreement. Can add arguments to grammar rules/dictionary entries, as in the sketch below. • sentence --> np(Num), vp(Num). • np(Num) --> art, noun(Num). • noun(sing) --> [cat].
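A fuller grammar along these lines might look as follows. This is a minimal DCG sketch with an illustrative lexicon of our own choosing (not from the slides): Num is threaded through the rules to enforce subject-verb agreement, and verb entries are marked as transitive or intransitive.

    % Number agreement (sing/plur) is passed as an argument.
    sentence            --> np(Num), vp(Num).

    np(Num)             --> pname(Num).
    np(Num)             --> art, adjs, noun(Num).
    np(Num)             --> art, adjs, noun(Num), pp.   % "the cat on the table"

    adjs                --> [].                         % zero or more adjectives
    adjs                --> adj, adjs.                  % recursive rule

    vp(Num)             --> verb(intrans, Num).         % "jumps"
    vp(Num)             --> verb(intrans, Num), pp.     % "jumps in the water"
    vp(Num)             --> verb(trans, Num), np(_).    % "likes the cat"

    pp                  --> prep, np(_).                % preposition + noun phrase

    % Illustrative dictionary entries.
    pname(sing)         --> [john].
    art                 --> [the].
    adj                 --> [small].
    adj                 --> [fluffy].
    noun(sing)          --> [cat].
    noun(plur)          --> [cats].
    noun(sing)          --> [table].
    noun(sing)          --> [water].
    verb(intrans, sing) --> [jumps].
    verb(trans, sing)   --> [likes].
    verb(trans, plur)   --> [like].
    prep                --> [in].
    prep                --> [on].

With this grammar ?- sentence([the, cats, like, john], []). succeeds, while ?- sentence([the, cats, likes, john], []). fails because the agreement argument cannot be unified.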

  4. Semantics • Syntax: Uses a grammar to structure the sentence. • Semantics: Maps this to a structured representation that can be used in inference (often referred to as sentence meaning). • Possible representations: • SQL. Map “Find me all the students who are taking AI3” to the relevant SQL query. • Predicate Logic: Map “John loves anyone who is tall” onto the relevant statement in predicate logic. • Other structured representations, e.g., a “case frame”: action: loves, subject: john, object: mary.
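To make the last two options concrete, here is a small sketch of how they might be written down in Prolog; the particular encodings are our own illustrations, not a standard.

    % "John loves anyone who is tall" as a predicate-logic style clause:
    %   for all X: tall(X) -> loves(john, X)
    loves(john, X) :- tall(X).

    % "John loves Mary" as a case frame, encoded as a Prolog term:
    meaning(sentence1, frame(action(loves), subject(john), object(mary))).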

  5. Semantics • How do we get from the parsed sentence to this kind of representation? • In general rather tricky, but to illustrate the idea we will show how it could be done for “John loves Mary” by adding extra arguments to a Prolog grammar. • We want to map that sentence to • loves(john, mary). • We will cheat by assuming that the functor of Prolog structured objects can be a variable: • Verb(Subject, Object)

  6. Grammar with Semantics sentence(Verb(Subject, Object)) --> nounPhrase(Subject), verbPhrase(Verb, Object). nounPhrase(Subject) --> properName(Subject). verbPhrase(Verb, Object) --> verb(Verb), nounPhrase(Object). • The general idea is that we can “compose” the sentence meaning by working out the “meaning” of the syntactic constituents and sticking the results together somehow. A runnable version without the variable-functor cheat is sketched below.
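Standard Prolog does not actually allow a variable in functor position, but the same composition can be written legally by letting the verb's dictionary entry supply the meaning term. The sketch below is one way to do it; the predicate names and the two-word lexicon are just enough for the example.

    % sentence(Meaning) parses a sentence and returns its meaning term.
    sentence(Meaning)            --> nounPhrase(Subject),
                                     verbPhrase(Subject, Meaning).

    nounPhrase(Subject)          --> properName(Subject).

    % The verb entry already contains the meaning skeleton, so the grammar
    % only has to fill in the subject and object.
    verbPhrase(Subject, Meaning) --> verb(Subject, Object, Meaning),
                                     nounPhrase(Object).

    properName(john)             --> [john].
    properName(mary)             --> [mary].

    verb(Subject, Object, loves(Subject, Object)) --> [loves].

The query ?- sentence(M, [john, loves, mary], []). then returns M = loves(john, mary).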

  7. Pragmatics • But we can’t get very far without knowing something about the world, and the context in which a sentence is uttered. • Pragmatics deals with this. • Example: determining the referents of pronouns etc. • “John likes that blue car. He buys it.” • We need context to determine what is being referred to by “that blue car”, “he” and “it” (a crude sketch follows). • Then we can create the meaning: likes(john, car1) and buys(john, car1).
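One very crude way to sketch this kind of referent resolution in Prolog: keep a list of recently mentioned entities (most recent first) and pick the first one whose properties match the pronoun. The context representation and predicate names here are our own assumptions, purely for illustration.

    :- use_module(library(lists)).

    % resolve(+Pronoun, +Context, -Referent)
    % Context is a list of entity(Id, Kind) terms, most recently mentioned first.
    resolve(he,  Context, Referent) :- member(entity(Referent, male),   Context), !.
    resolve(she, Context, Referent) :- member(entity(Referent, female), Context), !.
    resolve(it,  Context, Referent) :- member(entity(Referent, thing),  Context), !.

    % After "John likes that blue car" the context might be
    % [entity(car1, thing), entity(john, male)], so:
    %   ?- resolve(he, [entity(car1, thing), entity(john, male)], X).   X = john.
    %   ?- resolve(it, [entity(car1, thing), entity(john, male)], X).   X = car1.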

  8. Pragmatics • Pragmatics is also about what people DO with language. • Making sense of, and generating language involves mapping language to goals. • “Do you have the time?” -> speaker wants to know the time. • “When is the last train to London?” -> speaker probably wants to go there. • We can apply some of our planning ideas to this problem.

  9. Pragmatics and Plans • As an example of a plan-based approach to language, consider the actions of requesting, informing, and asking. • These are referred to as “speech acts”. • We can describe them as planning operators. • The preconditions and effects refer to the speaker’s and hearer’s beliefs and desires. • We use a notation to describe these: • knows(Agent, Fact) • wants(Agent, State/Action) • e.g., wants(fred, kiss(fred, mary)) • knows(fred, loves(mary, joe))

  10. More speech acts • Sketch of inform and request: • inform(Speaker, Hearer, Fact) pre: knows(Speaker, Fact) wants(Speaker, knows(Hearer, Fact)) add: knows(Hearer, Fact) knows(Speaker, knows(Hearer, Fact)) • How does this oversimplify the “informing” action? • request(Speaker, Hearer, do(Hearer, Action)) pre: wants(Speaker, Action) knows(Speaker, cando(Hearer, Action)) add: wants(Hearer, Action) • (Note: a bit tricky to integrate with ordinary planning rules; one possible encoding is sketched below.) • We talk of people having “communicative goals” (like wanting someone to know something).
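One way these speech acts might be written down as STRIPS-style planning operators in Prolog. The operator/4 format (name, preconditions, add list, delete list) is an assumption of this sketch rather than anything standard.

    % operator(Action, Preconditions, AddList, DeleteList).

    operator(inform(Speaker, Hearer, Fact),
             [knows(Speaker, Fact),
              wants(Speaker, knows(Hearer, Fact))],
             [knows(Hearer, Fact),
              knows(Speaker, knows(Hearer, Fact))],
             []).

    operator(request(Speaker, Hearer, do(Hearer, Action)),
             [wants(Speaker, Action),
              knows(Speaker, cando(Hearer, Action))],
             [wants(Hearer, Action)],
             []).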

  11. Putting it all together • Given sentences like the following, spoken by John about Fred: • “What is the time?” • “He has missed the train.” • We can now: • parse the sentence; • map that to a structured representation that is good for inference; • use context and knowledge of goals/plans to obtain from that: • wants(john, know(john, time1)) (where time1 is the time at some instant) • believes(john, missed(fred, train2))

  12. Language Generation • Language processing is also about the generation of language. • Structured representation --> NL text. • The simplest generation method uses templates, mapping the representation straight to a text template (with variables/slots to fill in), as in the sketch below. • loves(X, Y) -> X “loves” Y • gives(X, Y, Z) -> X “gives the” Y “to” Z • Mail-merge tools in word processors work similarly, extracting data from a simple database to fill slots.
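In Prolog such templates might be encoded directly as facts that map a meaning term onto a list of words; the predicate name generate/2 is just an illustration.

    % generate(+Meaning, -Words): map a fact straight onto a word list.
    generate(loves(X, Y),    [X, loves, Y]).
    generate(gives(X, Y, Z), [X, gives, the, Y, to, Z]).

    % ?- generate(loves(john, mary), Words).
    %    Words = [john, loves, mary].
    % ?- generate(gives(john, book, mary), Words).
    %    Words = [john, gives, the, book, to, mary].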

  13. Language Generation • But there is much more to language generation in general. Templates are very rigid. • Consider “John eats the cheese. John eats the apple. John sneezes. John laughs.” • Better as “John eats the cheese and apple, then sneezes. He then laughs.” • Getting good style involves working out how to map many facts onto one sentence, when to use pronouns, and when to use “connectives” like “then”.

  14. Language Generation • Serious language generation involves deciding: • what to say; • how to order and structure it; • how to break it up into sentences; • how to refer to objects (using pronouns, and expressions like “the cat” etc.); • how to express things in terms of grammatically correct sentences. • Often the starting point is a communicative goal.

  15. Summary • Natural Language Processing includes: • Syntax • Semantics • Pragmatics • And involves: • Generating language • Understanding language
