
Knowledge Representation and Reasoning


Presentation Transcript


  1. Knowledge Representation and Reasoning. Stuart C. Shapiro: Professor, CSE; Director, SNePS Research Group; Member, Center for Cognitive Science. S.C. Shapiro

  2. Introduction

  3. Long-Term Goal • Theory and Implementation of a Natural-Language-Competent Computerized Cognitive Agent • and Supporting Research in Artificial Intelligence, Cognitive Science, and Computational Linguistics.

  4. Research Areas • Knowledge Representation and Reasoning • Cognitive Robotics • Natural-Language Understanding • Natural-Language Generation.

  5. Goal • A computational cognitive agent that can: • Understand and communicate in English; • Discuss specific, generic, and “rule-like” information; • Reason; • Discuss acts and plans; • Sense; • Act; • Remember and report what it has sensed and done.

  6. Cassie • A computational cognitive agent • Embodied in hardware • or Software-Simulated • Based on SNePS and GLAIR.

  7. GLAIR Architecture (Grounded Layered Architecture with Integrated Reasoning) • Knowledge Level: SNePS • Perceptuo-Motor Level: NL • Sensory-Actuator Level: Vision, Sonar, Proprioception, Motion

  8. SNePS • Knowledge Representation and Reasoning • Propositions as Terms • SNIP: SNePS Inference Package • Specialized connectives and quantifiers • SNeBR: SNePS Belief Revision • SNeRE: SNePS Rational Engine • Interface Languages • SNePSUL: Lisp-Like • SNePSLOG: Logic-Like • GATN for Fragments of English.

  9. Example Cassies & Worlds

  10. Blocks World

  11. FEVAHR

  12. FEVAHR World Simulation

  13. UXO Remediation [Figure: a field with corner flags, a drop-off zone, UXO and non-UXO objects, a battery meter, a recharging station, a safe zone, and Cassie]

  14. Crystal Space Environment

  15. Sample Research Issues: Complex Categories

  16. Complex Categories 1 • Noun Phrases: <Det> {N | Adj}* N • Understanding of the modification must be left to reasoning. • Example: orange juice seat • Representation must be left vague.
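The NP pattern above can be sketched as a simple recognizer (a toy illustration, not the GATN grammar the slides refer to; the POS lexicon below is my own illustrative assumption):

```python
# Minimal recognizer for the NP pattern <Det> {N | Adj}* N.
# The POS lexicon is a hypothetical toy, chosen to show why
# "orange juice seat" parses without deciding how the nouns modify each other.

POS = {  # hypothetical toy lexicon
    "the": {"Det"},
    "orange": {"Adj", "N"},
    "juice": {"N"},
    "seat": {"N"},
    "excellent": {"Adj"},
    "teacher": {"N"},
}

def is_np(tokens):
    """True if tokens match <Det> {N | Adj}* N."""
    if len(tokens) < 2 or "Det" not in POS.get(tokens[0], set()):
        return False
    if "N" not in POS.get(tokens[-1], set()):
        return False
    # every intermediate word must be usable as a noun or adjective
    return all(POS.get(t, set()) & {"N", "Adj"} for t in tokens[1:-1])

print(is_np("the orange juice seat".split()))   # True
print(is_np("the excellent teacher".split()))   # True
```

The recognizer accepts the string but, as the slide says, deliberately leaves the semantics of the modification to reasoning.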

  17. Complex Categories 2 : Kevin went to the orange juice seat. I understand that Kevin went to the orange juice seat. : Did Kevin go to a seat? Yes, Kevin went to the orange juice seat.

  18. Complex Categories 3 : Pat is an excellent teacher. I understand that Pat is an excellent teacher. : Is Pat a teacher? Yes, Pat is a teacher. : Lucy is a former teacher. I understand that Lucy is a former teacher.

  19. Complex Categories 4 : `former' is a negative adjective. I understand that `former' is a negative adjective. : Is Lucy a teacher? No, Lucy is not a teacher. Also note representation and use of knowledge about words.

  20. Sample Research Issues: Indexicals

  21. Representation and Use of Indexicals • Words whose meanings are determined by occasion of use • E.g. I, you, now, then, here, there • Deictic Center <*I, *YOU, *NOW> • *I: SNePS term representing Cassie • *YOU: person Cassie is talking with • *NOW: current time.

  22. Analysis of Indexicals (in input) • First-person pronouns: *YOU • Second-person pronouns: *I • “here”: location of *YOU • Present/Past relative to *NOW.
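The input-analysis rules above can be sketched as a lookup against the deictic center (illustrative only; the term names mirror the slides, but the resolution table and time term are my assumptions, not SNePS code):

```python
# Sketch of the deictic-center idea: a triple <*I, *YOU, *NOW> against
# which indexicals in the user's input are resolved.

deictic_center = {"*I": "Cassie", "*YOU": "Stu", "*NOW": "t0"}  # t0: hypothetical time term

def resolve_input(word):
    """Map an indexical occurring in input to a deictic-center term."""
    table = {
        "i": "*YOU",    # first-person pronoun in input denotes the speaker, i.e. *YOU
        "me": "*YOU",
        "you": "*I",    # second-person pronoun in input denotes Cassie, i.e. *I
        "here": "location of *YOU",
        "now": "*NOW",
    }
    return table.get(word.lower(), word)

print(resolve_input("I"))     # *YOU
print(resolve_input("you"))   # *I
```

Generation (slide 23) runs the same table in reverse: *I comes out as a first-person pronoun, and *NOW fixes tense and aspect.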

  23. Generation of Indexicals • *I: First person pronouns • *YOU: Second person pronouns • *NOW: used to determine tense and aspect.

  24. Use of Indexicals 1 Come here.

  25. Use of Indexicals 2 Come here. I came to you, Stu. I am near you.

  26. Use of Indexicals 3 Who am I? Your name is ‘Stu’ and you are a person. Who have you talked to? I am talking to you. Talk to Bill. I am talking to you, Bill. Come here.

  27. Use of Indexicals 4 Come here. I found you. I am looking at you.

  28. Use of Indexicals 5 Come here. I found you. I am looking at you. I came to you. I am near you.

  29. Use of Indexicals 6 Who am I? Your name is ‘Bill’ and you are a person. Who are you? I am the FEVAHR and my name is ‘Cassie’. Who have you talked to? I talked to Stu and I am talking to you.

  30. Current Research Issues: Distinguishing Perceptually Indistinguishable Objects. Ph.D. Dissertation, John F. Santore

  31. Some robots in a suite of rooms.

  32. Are these the same two robots? • Why do you think so/not?

  33. Next Steps • How do people do this? • Currently doing protocol experiments • Getting Cassie to do it.

  34. Current Research Issues: Belief Revision in a Deductively Open Belief Space. Ph.D. Dissertation, Frances L. Johnson

  35. Belief Revision in a Deductively Open Belief Space • Beliefs in a knowledge base must be changeable (belief revision) • Add & remove beliefs • Detect and correct errors/conflicts/inconsistencies • BUT … • Guaranteeing consistency is an ideal concept • Real-world systems are not ideal

  36. Belief Revision in a DOBS: Ideal Theories vs. Real World • Ideal belief revision theories assume: • No reasoning limits (time or storage) • All derivable beliefs are acquirable (deductive closure) • All belief credibilities are known and fixed • Real world: • Reasoning takes time, storage space is finite • Some implicit beliefs might be currently inaccessible • Source/belief credibilities can change

  37. Belief Revision in a DOBS: A Real-World KR System • Must recognize its limitations • Some knowledge remains implicit • Inconsistencies might be missed • A source turns out to be unreliable • Revision choices might be poor in hindsight • After further deduction or knowledge acquisition • Must repair itself • Catch and correct poor revision choices

  38. Belief Revision in a DOBS: Theory Example – Reconsideration • Ranking 1 is more credible than Ranking 2. • College A is better than College B. (Source: Ranking 1) • College B is better than College A. (Source: Ranking 2) • Ranking 1 was flawed, so Ranking 2 is more credible than Ranking 1. Need to reconsider!
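The reconsideration example above can be sketched as follows (a deliberately simplified toy, not the dissertation's actual algorithm; the numeric credibility scores and data layout are my assumptions):

```python
# Illustrative sketch of "reconsideration": when source credibilities change,
# re-decide which of two conflicting beliefs to hold, possibly reinstating
# a previously retracted one.

def reconsider(conflicting, credibility):
    """Given conflicting beliefs tagged with sources, keep the belief
    whose source currently ranks highest."""
    return max(conflicting, key=lambda b: credibility[b["source"]])

beliefs = [
    {"claim": "A better than B", "source": "Ranking 1"},
    {"claim": "B better than A", "source": "Ranking 2"},
]

cred = {"Ranking 1": 2, "Ranking 2": 1}
print(reconsider(beliefs, cred)["claim"])  # A better than B

cred = {"Ranking 1": 0, "Ranking 2": 1}    # Ranking 1 found to be flawed
print(reconsider(beliefs, cred)["claim"])  # B better than A
```

The point of the slide is exactly the second call: the retracted belief "B better than A" must be recoverable when the credibility ordering flips, which an idealized (deductively closed) theory takes for granted but a real system must engineer.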

  39. Next Steps • Implement reconsideration • Develop benchmarks for implemented KRR systems.

  40. Current Research Issues: Default Reasoning by Preferential Ordering of Beliefs. M.S. Thesis, Bharat Bhushan

  41. Small Knowledge Base • Birds have wings. • Birds fly. • Penguins are birds. • Penguins don’t fly.

  42. KB Using Default Logic • ∀x(Bird(x) → Has(x, wings)) • Bird(x) : Flies(x) / Flies(x) • ∀x(Penguin(x) → Bird(x)) • ∀x(Penguin(x) → ¬Flies(x))

  43. KB Using Preferential Ordering • ∀x(Bird(x) → Has(x, wings)) • ∀x(Bird(x) → Flies(x)) • ∀x(Penguin(x) → Bird(x)) • ∀x(Penguin(x) → ¬Flies(x)) • Precludes(∀x(Penguin(x) → ¬Flies(x)), ∀x(Bird(x) → Flies(x)))
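The effect of the Precludes ordering above can be sketched in Python (an illustrative toy, not SNePS or SNePSLOG; the rule names and dict encoding are my assumptions):

```python
# Toy preferential-ordering reasoner: a rule precluded by another rule
# that actually fired is suppressed, so a penguin inherits wings but
# not flight from the bird rules.

rules = {
    "bird-wings":        {"if": "bird",    "then": ("has_wings", True)},
    "bird-flies":        {"if": "bird",    "then": ("flies", True)},
    "penguin-not-flies": {"if": "penguin", "then": ("flies", False)},
}
precludes = {("penguin-not-flies", "bird-flies")}  # the ordering on this slide

def conclusions(facts):
    fired = {name for name, r in rules.items() if r["if"] in facts}
    # drop any rule precluded by another rule that also fired
    active = {n for n in fired
              if not any((m, n) in precludes for m in fired)}
    return dict(rules[n]["then"] for n in active)

print(conclusions({"bird"}) == {"has_wings": True, "flies": True})           # True
print(conclusions({"bird", "penguin"}) == {"has_wings": True, "flies": False})  # True
```

Note that preclusion is conditional: for a plain bird the penguin rule never fires, so the bird-flies rule survives, which is exactly what the default-logic rule Bird(x) : Flies(x) / Flies(x) on the previous slide achieves.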

  44. Next Steps • Finish theory and implementation.

  45. Current Research Issues: Representation & Reasoning with Arbitrary Objects. Stuart C. Shapiro

  46. Classical Representation • Clyde is gray. • Gray(Clyde) • All elephants are gray. • ∀x(Elephant(x) → Gray(x)) • Some elephants are albino. • ∃x(Elephant(x) ∧ Albino(x)) • Why the difference?

  47. Representation Using Arbitrary & Indefinite Objects • Clyde is gray. • Gray(Clyde) • Elephants are gray. • Gray(any x Elephant(x)) • Some elephants are albino. • Albino(some x Elephant(x))

  48. Subsumption Among Arbitrary & Indefinite Objects • (any x Elephant(x)) subsumes (any x Albino(x) & Elephant(x)) • (some x Albino(x) & Elephant(x)) subsumes (some x Elephant(x)) • If x subsumes y, then P(x) ⇒ P(y)
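One plausible reading of the subsumption rule above can be sketched in Python (my own simplification, not SNePS 3: a quantified term is reduced to a quantifier plus a set of restriction predicates, and nonemptiness of the restricted class is glossed over):

```python
# Subsumption between arbitrary ("any") and indefinite ("some") objects.
# The less restricted arbitrary object subsumes the more restricted one;
# for indefinite objects the direction flips.

def subsumes(x, y):
    """x, y are (quantifier, restriction-set) pairs."""
    qx, rx = x
    qy, ry = y
    if qx == "some" and qy == "some":
        return rx >= ry   # some Albino-Elephant subsumes some Elephant
    if qx == "any":
        return rx <= ry   # any Elephant subsumes any Albino-Elephant
    return False          # an indefinite never subsumes an arbitrary

chain = [
    ("any",  {"Elephant"}),
    ("any",  {"Albino", "Elephant"}),
    ("some", {"Albino", "Elephant"}),
    ("some", {"Elephant"}),
]
print(all(subsumes(a, b) for a, b in zip(chain, chain[1:])))  # True
```

The chain reproduces the slide: each term subsumes the next, so by the rule P(x) ⇒ P(y) a property proved of the first propagates all the way down.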

  49. Example (Runs in SNePS 3) Hungry(any x Elephant(x) & Eats(x, any y Tall(y) & Grass(y) & On(y, Savanna))) ⇒ Hungry(any u Albino(u) & Elephant(u) & Eats(u, any v Grass(v) & On(v, Savanna)))

  50. Next Steps • Finish theory and implementation of arbitrary and indefinite objects. • Extend to other generalized quantifiers • Such as most, many, few, no, both, 3 of, …
